Apr 22 18:43:35.739114 ip-10-0-132-198 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:43:36.210952 ip-10-0-132-198 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:36.210952 ip-10-0-132-198 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:43:36.210952 ip-10-0-132-198 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:36.210952 ip-10-0-132-198 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:43:36.210952 ip-10-0-132-198 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:36.214473 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.214381    2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:43:36.224299 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224268    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:36.224299 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224294    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:36.224299 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224297    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:36.224299 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224302    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:36.224299 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224307    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:36.224299 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224311    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224314    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224319    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224322    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224326    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224330    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224333    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224335    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224338    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224340    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224343    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224346    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224348    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224351    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224353    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224356    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224359    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224361    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224364    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224366    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:36.224569 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224370    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224373    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224375    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224378    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224380    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224383    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224386    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224388    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224391    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224393    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224397    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224400    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224403    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224406    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224410    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224413    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224416    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224418    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224421    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224424    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:36.225097 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224427    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224430    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224433    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224435    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224438    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224440    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224443    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224445    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224448    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224451    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224455    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224458    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224461    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224464    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224466    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224469    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224472    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224474    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224477    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:36.225597 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224480    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224483    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224485    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224488    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224490    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224493    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224496    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224499    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224501    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224504    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224506    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224509    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224511    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224514    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224518    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224521    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224523    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224526    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224528    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224531    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:36.226073 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224533    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.224536    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225010    2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225017    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225020    2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225023    2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225026    2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225028    2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225031    2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225034    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225036    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225039    2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225044    2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225048    2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225051    2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225054    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225057    2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225060    2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225063    2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225067    2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:36.226554 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225070    2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225073    2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225076    2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225079    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225081    2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225084    2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225087    2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225090    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225093    2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225095    2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225098    2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225101    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225105    2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225108    2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225111    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225114    2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225117    2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225120    2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225122    2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:36.227134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225125    2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225128    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225130    2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225133    2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225135    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225138    2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225140    2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225143    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225146    2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225148    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225152    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225154    2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225157    2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225160    2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225163    2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225165    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225168    2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225170    2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225173    2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225176    2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:36.227624 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225178    2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225181    2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225183    2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225186    2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225189    2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225191    2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225194    2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225196    2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225200    2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225202    2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225205    2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225207    2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225210    2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225214    2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225216    2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225219    2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225221    2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225224    2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225226    2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:36.228134 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225229    2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225231    2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225234    2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225236    2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225239    2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225242    2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225245    2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225247    2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225250    2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.225252    2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226017    2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226028    2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226035    2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226040    2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226046    2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226049    2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226054    2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226059    2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226062    2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226065    2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226069    2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226073    2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:43:36.228601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226077    2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226080    2569 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226083    2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226086    2569 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226089    2569 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226092    2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226095    2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226100    2569 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226104    2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226107    2569 flags.go:64] FLAG: --config-dir=""
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226110    2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226113    2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226118    2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226121    2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226124    2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226128    2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226131    2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226134    2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226137    2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226140    2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226144    2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226148    2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226151    2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226154    2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226157    2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:43:36.229162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226162    2569 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226165    2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226169    2569 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226173    2569 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226176    2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226179    2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226183    2569 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226187    2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226190    2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226193    2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226197    2569 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226200    2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226203    2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226206    2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226209    2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr
22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226212 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226216 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226218 2569 flags.go:64] FLAG: --feature-gates="" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226222 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226225 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226229 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226232 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226235 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226238 2569 flags.go:64] FLAG: --help="false" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226241 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.229794 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226244 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226247 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226250 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226253 2569 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226256 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226259 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226262 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226265 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226269 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226272 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226275 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226278 2569 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226281 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226284 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226287 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226290 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226293 2569 flags.go:64] FLAG: --lock-file="" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226295 2569 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226298 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226301 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226307 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226310 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226313 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:43:36.230438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226316 2569 flags.go:64] FLAG: --logging-format="text" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226319 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226322 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226325 2569 flags.go:64] FLAG: --manifest-url="" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226328 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226333 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226336 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226340 2569 flags.go:64] FLAG: --max-pods="110" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226343 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:43:36.226346 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226349 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226352 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226355 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226358 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226361 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226369 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226372 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226375 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226379 2569 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226382 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226388 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226391 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226394 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 22 
18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226403 2569 flags.go:64] FLAG: --port="10250" Apr 22 18:43:36.231023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226406 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226409 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04fc6a157e6faa13c" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226412 2569 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226415 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226418 2569 flags.go:64] FLAG: --register-node="true" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226421 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226425 2569 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226428 2569 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226431 2569 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226434 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226437 2569 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226441 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226444 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226447 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 
18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226450 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226453 2569 flags.go:64] FLAG: --runonce="false" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226456 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226459 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226462 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226465 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226468 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226471 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226474 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226477 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226480 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226483 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:43:36.231638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226486 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226489 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:43:36.232294 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:43:36.226492 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226495 2569 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226498 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226504 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226507 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226510 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226515 2569 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226518 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226521 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226524 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226529 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226532 2569 flags.go:64] FLAG: --v="2" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226536 2569 flags.go:64] FLAG: --version="false" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226546 2569 flags.go:64] FLAG: --vmodule="" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226550 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 
18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.226553 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226654 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226658 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226662 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226665 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226668 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:43:36.232294 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226672 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226675 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226678 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226681 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226684 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226687 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226690 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226692 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226695 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226697 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226700 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226703 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226706 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226721 2569 feature_gate.go:328] unrecognized 
feature gate: EtcdBackendQuota Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226726 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226729 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226732 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226734 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226737 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226740 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:43:36.232878 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226742 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226747 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226749 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226752 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226754 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226758 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:43:36.233438 ip-10-0-132-198 
kubenswrapper[2569]: W0422 18:43:36.226760 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226763 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226766 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226769 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226771 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226774 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226777 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226780 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226782 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226785 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226790 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226793 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226796 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226799 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:43:36.233438 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226802 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226804 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226807 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226810 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226813 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226816 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226820 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226823 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226825 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226828 2569 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226831 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226833 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226836 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226839 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226842 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226844 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226847 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226849 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226852 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226854 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:43:36.233967 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226857 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226860 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: 
W0422 18:43:36.226862 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226865 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226867 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226870 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226873 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226875 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226878 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226881 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226883 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226886 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226889 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226891 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226894 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 
18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226896 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226899 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226912 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226917 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226919 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:36.234471 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.226922 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:36.234982 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.227851 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:43:36.236522 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.236492 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:43:36.236522 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.236521 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236600 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236606 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236609 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236612 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236614 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236617 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236620 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:36.236619 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236622 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236625 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236629 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236633 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236635 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236638 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236641 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236644 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236647 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236650 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236652 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236655 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236659 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236661 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236664 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236667 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236669 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236672 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236675 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236677 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:36.236847 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236680 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236682 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236685 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236687 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236692 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236697 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236701 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236704 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236728 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236731 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236733 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236736 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236739 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236742 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236746 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236749 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236752 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236755 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236758 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:36.237356 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236761 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236763 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236766 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236768 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236772 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236774 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236777 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236779 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236782 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236785 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236787 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236790 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236792 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236795 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236797 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236800 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236802 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236805 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236808 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236810 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:36.237844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236813 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236815 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236818 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236821 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236824 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236826 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236829 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236832 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236835 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236837 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236840 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236842 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236845 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236848 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236851 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236854 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236857 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236860 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236863 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:36.238377 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.236866 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.236871 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237013 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237018 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237022 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237025 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237028 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237031 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237034 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237037 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237040 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237042 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237046 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237048 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237051 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237054 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:36.238864 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237056 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237059 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237061 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237064 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237067 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237070 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237073 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237076 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237079 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237081 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237084 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237086 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237089 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237092 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237094 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237098 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237102 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237105 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237107 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237110 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:36.239262 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237113 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237115 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237118 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237120 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237123 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237125 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237128 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237130 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237133 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237136 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237138 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237141 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237144 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237146 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237149 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237152 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237154 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237158 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237160 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237163 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:36.239792 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237166 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237168 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237171 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237173 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237176 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237178 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237180 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237184 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237187 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237190 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237193 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237196 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237199 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237202 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237205 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237208 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237211 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237214 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237216 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:36.240295 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237219 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237222 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237224 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237227 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237229 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237232 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237234 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237236 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237240 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237242 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237245 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237248 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:36.237250 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.237254 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:43:36.240980 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.238026 2569 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:43:36.242236 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.242220 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:43:36.243270 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.243256 2569 server.go:1019] "Starting client certificate rotation"
Apr 22 18:43:36.243425 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.243403 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:43:36.243479 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.243457 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:43:36.270867 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.270833 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:43:36.277098 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.277065 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:43:36.291242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.291215 2569 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:43:36.296986 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.296955 2569 log.go:25] "Validated CRI v1 image API"
Apr 22 18:43:36.298408 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.298385 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:43:36.304351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.304293 2569 fs.go:135] Filesystem UUIDs: map[4950e045-581b-44c8-a914-5ffd297e1bed:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 95e4839b-a2c7-45f8-a5ed-dbeb92beaa29:/dev/nvme0n1p3]
Apr 22 18:43:36.304351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.304339 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:43:36.304487 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.304349 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:43:36.309989 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.309866 2569 manager.go:217] Machine: {Timestamp:2026-04-22 18:43:36.308202321 +0000 UTC m=+0.445546729 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100996 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24ff46e48e6ce28986c666e87723c7 SystemUUID:ec24ff46-e48e-6ce2-8986-c666e87723c7 BootID:5021d28e-670c-4125-9188-bd176230e74c Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:11:ae:e5:bf:05 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:11:ae:e5:bf:05 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:9d:a3:89:68:74 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:43:36.309989 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.309979 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:43:36.310111 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.310071 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:43:36.312120 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.312093 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:43:36.312285 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.312121 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-198.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPoli
cyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:43:36.312331 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.312298 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:43:36.312331 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.312307 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:43:36.312331 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.312320 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:43:36.313195 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.313184 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:43:36.314638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.314626 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:43:36.314785 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.314776 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:43:36.317449 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.317437 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:43:36.317487 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.317460 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:43:36.317487 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.317473 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:43:36.317487 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.317483 2569 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:43:36.317593 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.317499 2569 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:43:36.318603 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.318591 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:43:36.318654 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.318610 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:43:36.321771 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.321732 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nr8rl" Apr 22 18:43:36.322777 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.322756 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:43:36.324695 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.324675 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:43:36.326819 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326805 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326826 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326835 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326843 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326851 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:43:36.326906 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:43:36.326859 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326869 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326878 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326888 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326897 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:43:36.326906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326910 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:43:36.327191 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.326923 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:43:36.328023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.328013 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:43:36.328071 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.328026 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:43:36.330818 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.330795 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nr8rl" Apr 22 18:43:36.331067 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.331047 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-198.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:43:36.331130 
ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.331061 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:43:36.331130 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.331061 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:43:36.331928 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.331912 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:43:36.331996 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.331960 2569 server.go:1295] "Started kubelet" Apr 22 18:43:36.332260 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.332217 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:43:36.332260 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.332220 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:43:36.332372 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.332286 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:43:36.333038 ip-10-0-132-198 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:43:36.334056 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.334041 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:43:36.335349 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.335334 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:43:36.341384 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.341362 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:43:36.341890 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.341869 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:43:36.343105 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343086 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:43:36.343105 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343093 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:43:36.343231 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343114 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:43:36.343231 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343183 2569 factory.go:153] Registering CRI-O factory Apr 22 18:43:36.343231 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343206 2569 factory.go:223] Registration of the crio container factory successfully Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343260 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343269 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343259 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix 
/run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343287 2569 factory.go:55] Registering systemd factory Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343297 2569 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343318 2569 factory.go:103] Registering Raw factory Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.343333 2569 manager.go:1196] Started watching for new ooms in manager Apr 22 18:43:36.343376 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.343336 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:43:36.343813 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.343792 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found" Apr 22 18:43:36.344041 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.344018 2569 manager.go:319] Starting recovery of all containers Apr 22 18:43:36.346216 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.346190 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:36.349645 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.349617 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-198.ec2.internal\" not found" node="ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.353904 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.353702 2569 manager.go:324] Recovery completed Apr 22 18:43:36.358846 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.358828 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller 
attach/detach" Apr 22 18:43:36.361734 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.361700 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:36.361813 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.361748 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:36.361813 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.361764 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:36.362296 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.362281 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:43:36.362361 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.362297 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:43:36.362361 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.362319 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:43:36.365053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.365039 2569 policy_none.go:49] "None policy: Start" Apr 22 18:43:36.365143 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.365059 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:43:36.365143 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.365073 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:43:36.408012 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.407992 2569 manager.go:341] "Starting Device Plugin manager" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.408049 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.408063 2569 server.go:85] "Starting device plugin 
registration server" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.408354 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.408367 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.408475 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.408561 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.408571 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.409090 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:43:36.414665 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.409125 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-198.ec2.internal\" not found" Apr 22 18:43:36.446521 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.446471 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:43:36.447819 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.447801 2569 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 18:43:36.447948 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.447832 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:43:36.447948 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.447862 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:43:36.447948 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.447872 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:43:36.447948 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.447912 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:43:36.451598 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.451575 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:36.509031 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.508952 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:36.510377 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.510360 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:36.510462 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.510395 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:36.510462 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.510410 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:36.510462 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.510437 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.521318 ip-10-0-132-198 kubenswrapper[2569]: I0422 
18:43:36.521293 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.521377 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.521323 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-198.ec2.internal\": node \"ip-10-0-132-198.ec2.internal\" not found" Apr 22 18:43:36.548173 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.548136 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal"] Apr 22 18:43:36.548344 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.548208 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:36.549706 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.549688 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:36.549816 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.549742 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:36.549816 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.549757 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:36.550988 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.550969 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:36.551125 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.551111 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.551171 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.551137 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:36.551475 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.551461 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found" Apr 22 18:43:36.553210 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.553187 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:36.553210 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.553211 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:36.553356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.553193 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:36.553356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.553243 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:36.553356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.553256 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:36.553356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.553221 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:36.554508 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.554491 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.554590 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.554525 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:36.555274 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.555258 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:36.555358 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.555287 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:36.555358 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.555301 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:36.588538 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.588513 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-198.ec2.internal\" not found" node="ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.593155 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.593137 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-198.ec2.internal\" not found" node="ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.645112 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.645080 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d715d14c3e64c4935a99ec8a0e8eebfa-config\") pod \"kube-apiserver-proxy-ip-10-0-132-198.ec2.internal\" (UID: \"d715d14c3e64c4935a99ec8a0e8eebfa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.645112 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:43:36.645114 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/689f34861518d12cf2af6cfcea46e98b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal\" (UID: \"689f34861518d12cf2af6cfcea46e98b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.645312 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.645139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/689f34861518d12cf2af6cfcea46e98b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal\" (UID: \"689f34861518d12cf2af6cfcea46e98b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.652498 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.652475 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found" Apr 22 18:43:36.745307 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.745271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d715d14c3e64c4935a99ec8a0e8eebfa-config\") pod \"kube-apiserver-proxy-ip-10-0-132-198.ec2.internal\" (UID: \"d715d14c3e64c4935a99ec8a0e8eebfa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal" Apr 22 18:43:36.745431 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.745313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/689f34861518d12cf2af6cfcea46e98b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal\" (UID: \"689f34861518d12cf2af6cfcea46e98b\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:36.745431 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.745382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/689f34861518d12cf2af6cfcea46e98b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal\" (UID: \"689f34861518d12cf2af6cfcea46e98b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:36.745431 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.745399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/689f34861518d12cf2af6cfcea46e98b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal\" (UID: \"689f34861518d12cf2af6cfcea46e98b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:36.745431 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.745415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/689f34861518d12cf2af6cfcea46e98b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal\" (UID: \"689f34861518d12cf2af6cfcea46e98b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:36.745567 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.745458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d715d14c3e64c4935a99ec8a0e8eebfa-config\") pod \"kube-apiserver-proxy-ip-10-0-132-198.ec2.internal\" (UID: \"d715d14c3e64c4935a99ec8a0e8eebfa\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:36.753396 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.753373 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:36.853802 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.853761 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:36.890991 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.890963 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:36.896548 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:36.896529 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:36.954731 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:36.954678 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.055238 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.055209 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.155818 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.155727 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.242988 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.242941 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:43:37.243512 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.243106 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:43:37.243512 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.243144 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:43:37.256342 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.256307 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.333459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.333419 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:38:36 +0000 UTC" deadline="2028-01-06 14:28:28.29761281 +0000 UTC"
Apr 22 18:43:37.333459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.333448 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14971h44m50.964167576s"
Apr 22 18:43:37.341904 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.341553 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:43:37.353648 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.353620 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:43:37.357192 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.357169 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.370769 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.370740 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:43:37.395694 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:37.395649 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd715d14c3e64c4935a99ec8a0e8eebfa.slice/crio-4ca7f3ed1a45e111337648f1a14149fd734269550b1978a8cf0b789cc963ef73 WatchSource:0}: Error finding container 4ca7f3ed1a45e111337648f1a14149fd734269550b1978a8cf0b789cc963ef73: Status 404 returned error can't find the container with id 4ca7f3ed1a45e111337648f1a14149fd734269550b1978a8cf0b789cc963ef73
Apr 22 18:43:37.396384 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:37.396357 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod689f34861518d12cf2af6cfcea46e98b.slice/crio-09793089c889dbe530864e22eef40ff5a136ea793a3e20e1f3b9094c156166f0 WatchSource:0}: Error finding container 09793089c889dbe530864e22eef40ff5a136ea793a3e20e1f3b9094c156166f0: Status 404 returned error can't find the container with id 09793089c889dbe530864e22eef40ff5a136ea793a3e20e1f3b9094c156166f0
Apr 22 18:43:37.400902 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.400885 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:43:37.415464 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.415386 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m7llf"
Apr 22 18:43:37.428565 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.428538 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m7llf"
Apr 22 18:43:37.451278 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.451220 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal" event={"ID":"d715d14c3e64c4935a99ec8a0e8eebfa","Type":"ContainerStarted","Data":"4ca7f3ed1a45e111337648f1a14149fd734269550b1978a8cf0b789cc963ef73"}
Apr 22 18:43:37.452301 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.452280 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal" event={"ID":"689f34861518d12cf2af6cfcea46e98b","Type":"ContainerStarted","Data":"09793089c889dbe530864e22eef40ff5a136ea793a3e20e1f3b9094c156166f0"}
Apr 22 18:43:37.457436 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.457419 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.558035 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.557991 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.658532 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.658487 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.759124 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:37.759031 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-198.ec2.internal\" not found"
Apr 22 18:43:37.767099 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.767066 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:43:37.842682 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.842649 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:37.856272 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.856236 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:43:37.857228 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.857203 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal"
Apr 22 18:43:37.868463 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:37.868435 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:43:38.318908 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.318870 2569 apiserver.go:52] "Watching apiserver"
Apr 22 18:43:38.329639 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.329610 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:43:38.330035 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.330004 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-k7k5d","openshift-network-operator/iptables-alerter-lr9rd","kube-system/konnectivity-agent-7c2jq","kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal","openshift-cluster-node-tuning-operator/tuned-n2nr9","openshift-image-registry/node-ca-vckqs","openshift-ovn-kubernetes/ovnkube-node-zlsv5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj","openshift-dns/node-resolver-2dk4j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal","openshift-multus/multus-additional-cni-plugins-27fs7","openshift-multus/multus-w4nw5","openshift-multus/network-metrics-daemon-srr7v"]
Apr 22 18:43:38.333006 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.332975 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.334043 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.334019 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lr9rd"
Apr 22 18:43:38.334151 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.334103 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7c2jq"
Apr 22 18:43:38.335176 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.335154 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:38.335274 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.335229 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:38.336292 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.336272 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.336654 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.336635 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:43:38.336654 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.336645 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:43:38.336835 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.336810 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:43:38.336890 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.336860 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:43:38.336937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.336886 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:43:38.337420 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.337400 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vckqs"
Apr 22 18:43:38.337845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.337821 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:43:38.337961 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.337936 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:43:38.338040 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.338016 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:43:38.338094 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.337941 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qkfkn\""
Apr 22 18:43:38.338276 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.338254 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:43:38.338642 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.338471 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-82r49\""
Apr 22 18:43:38.338642 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.338538 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jqjm5\""
Apr 22 18:43:38.339460 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.339440 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.339887 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.339776 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:43:38.340145 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.340125 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wl2jm\""
Apr 22 18:43:38.340989 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.340969 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.342767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342473 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2dk4j"
Apr 22 18:43:38.342767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342524 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:43:38.342767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342608 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:43:38.342974 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342817 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:43:38.342974 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342848 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:43:38.342974 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342878 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:43:38.342974 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342885 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:43:38.342974 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342938 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:43:38.343193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.342986 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4rbfm\""
Apr 22 18:43:38.343937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.343909 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:43:38.344837 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.344560 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sdbzg\""
Apr 22 18:43:38.344837 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.344590 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:43:38.344837 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.344803 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-7mrr7\""
Apr 22 18:43:38.344837 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.344830 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:43:38.345168 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.345146 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:43:38.345501 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.345278 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:43:38.345501 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.345405 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:43:38.347833 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.347809 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:38.347940 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.347869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-27fs7"
Apr 22 18:43:38.347940 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.347895 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:38.348744 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.348697 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:43:38.348845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.348754 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:43:38.348983 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.348941 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bdh6\""
Apr 22 18:43:38.354256 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354230 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:43:38.354504 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354478 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:43:38.354632 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354615 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t2cwl\""
Apr 22 18:43:38.354697 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-lib-modules\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.354777 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-cni-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.354777 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354743 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/baf1611e-6531-40fa-9e20-fab0967f4575-agent-certs\") pod \"konnectivity-agent-7c2jq\" (UID: \"baf1611e-6531-40fa-9e20-fab0967f4575\") " pod="kube-system/konnectivity-agent-7c2jq"
Apr 22 18:43:38.354777 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354768 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-systemd\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.354929 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354792 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a4a7ddf-880f-4094-a270-f8e491751768-ovn-node-metrics-cert\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.354929 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354841 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-daemon-config\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.354929 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9rhp\" (UniqueName: \"kubernetes.io/projected/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-kube-api-access-l9rhp\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.354929 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354905 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kz44\" (UniqueName: \"kubernetes.io/projected/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-kube-api-access-4kz44\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd"
Apr 22 18:43:38.355108 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:38.355108 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc5wx\" (UniqueName: \"kubernetes.io/projected/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-kube-api-access-tc5wx\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j"
Apr 22 18:43:38.355108 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.354985 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-kubelet\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355080 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-systemd-units\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355347 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-cni-bin\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355370 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-env-overrides\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355442 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-var-lib-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355442 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-etc-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355502 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355464 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-conf-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.355536 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:38.355536 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-sys\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.355618 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355538 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-tuned\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.355618 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355552 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-socket-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.355618 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-host-slash\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd"
Apr 22 18:43:38.355618 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-node-log\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355618 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-iptables-alerter-script\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355621 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-run\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355655 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-var-lib-kubelet\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jztd\" (UniqueName: \"kubernetes.io/projected/1a4a7ddf-880f-4094-a270-f8e491751768-kube-api-access-2jztd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355735 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-etc-kubernetes\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355760 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-tmp-dir\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-device-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-sys-fs\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.355817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355818 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-kubernetes\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.356186 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355832 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-run-netns\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.356186 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355849 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.356186 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355894 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.356186 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355927 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-cni-bin\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.356186 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.355967 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-cni-multus\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.356186 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356001 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-kubelet\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.356392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356351 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-hostroot\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.356392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-modprobe-d\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.356478 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356396 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-slash\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.356478 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-registration-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.356478 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356443 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvxc\" (UniqueName:
\"kubernetes.io/projected/3ddb91ad-5ce3-46af-a515-0e68caddf824-kube-api-access-cqvxc\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" Apr 22 18:43:38.356478 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356469 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-multus-certs\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.356626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356495 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/baf1611e-6531-40fa-9e20-fab0967f4575-konnectivity-ca\") pod \"konnectivity-agent-7c2jq\" (UID: \"baf1611e-6531-40fa-9e20-fab0967f4575\") " pod="kube-system/konnectivity-agent-7c2jq" Apr 22 18:43:38.356626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj9h2\" (UniqueName: \"kubernetes.io/projected/0d277017-e970-494a-84b8-a3c2be9bbf85-kube-api-access-tj9h2\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:43:38.356626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356527 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-os-release\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.356626 ip-10-0-132-198 kubenswrapper[2569]: I0422 
18:43:38.356554 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqwdj\" (UniqueName: \"kubernetes.io/projected/1d694bc1-10aa-4951-95ba-8ec4878fea46-kube-api-access-tqwdj\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs" Apr 22 18:43:38.356626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356577 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysconfig\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.356626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356598 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-log-socket\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.356626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356620 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-ovnkube-script-lib\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356642 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-netns\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " 
pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356684 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-hosts-file\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356751 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysctl-conf\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6edae425-482e-4c41-a7dd-25e8579fb7cd-tmp\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356790 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-cni-netd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-system-cni-dir\") pod 
\"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356819 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-ovn\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356840 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-ovnkube-config\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356862 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d694bc1-10aa-4951-95ba-8ec4878fea46-host\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356896 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1d694bc1-10aa-4951-95ba-8ec4878fea46-serviceca\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysctl-d\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.356978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.356977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-systemd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357050 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-cni-binary-copy\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-socket-dir-parent\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357110 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-k8s-cni-cncf-io\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357141 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-host\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f57kv\" (UniqueName: \"kubernetes.io/projected/6edae425-482e-4c41-a7dd-25e8579fb7cd-kube-api-access-f57kv\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357195 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.357413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.357213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-cnibin\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 
18:43:38.429601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.429540 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:38:37 +0000 UTC" deadline="2028-02-08 06:58:48.355559126 +0000 UTC" Apr 22 18:43:38.429601 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.429593 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15756h15m9.925970455s" Apr 22 18:43:38.444285 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.444249 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:43:38.457694 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-systemd\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.457694 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457704 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a4a7ddf-880f-4094-a270-f8e491751768-ovn-node-metrics-cert\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.457949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457822 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-systemd\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.457949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457850 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-daemon-config\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.457949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9rhp\" (UniqueName: \"kubernetes.io/projected/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-kube-api-access-l9rhp\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.457949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kz44\" (UniqueName: \"kubernetes.io/projected/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-kube-api-access-4kz44\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd" Apr 22 18:43:38.457949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457924 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 18:43:38.457949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc5wx\" (UniqueName: \"kubernetes.io/projected/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-kube-api-access-tc5wx\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j" Apr 22 
18:43:38.457949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-kubelet\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.457974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-systemd-units\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458025 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-systemd-units\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-kubelet\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-cni-bin\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458253 
ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-env-overrides\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-var-lib-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458236 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-etc-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458263 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-conf-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-cni-bin\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458573 
ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-sys\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458290 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-etc-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458349 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-tuned\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-conf-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:43:38.458396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-socket-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-daemon-config\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458427 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-host-slash\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458527 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-sys\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.458573 
ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.458534 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-node-log\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.458573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-iptables-alerter-script\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd" Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-run\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-var-lib-kubelet\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-socket-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jztd\" (UniqueName: \"kubernetes.io/projected/1a4a7ddf-880f-4094-a270-f8e491751768-kube-api-access-2jztd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458677 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-host-slash\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458700 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-etc-kubernetes\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458726 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-env-overrides\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-run\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458776 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-node-log\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458743 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-tmp-dir\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458805 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-etc-kubernetes\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458807 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-var-lib-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-os-release\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458856 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-device-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.459351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458868 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-var-lib-kubelet\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-sys-fs\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458913 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgnp7\" (UniqueName: \"kubernetes.io/projected/100cdeaa-300c-4323-8dd0-f6d9ca2702af-kube-api-access-jgnp7\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458940 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-kubernetes\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-device-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-run-netns\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.458992 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-run-netns\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-kubernetes\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-sys-fs\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.459093 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:38.959056395 +0000 UTC m=+3.096400802 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459091 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459113 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-tmp-dir\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459152 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-cni-bin\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-cni-multus\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459204 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-kubelet\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459221 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-etc-selinux\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-hostroot\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459248 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-cni-bin\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459261 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-cni-multus\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-var-lib-kubelet\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459255 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-modprobe-d\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-slash\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-iptables-alerter-script\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " pod="openshift-network-operator/iptables-alerter-lr9rd"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-registration-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-slash\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459307 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-hostroot\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-modprobe-d\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459377 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvxc\" (UniqueName: \"kubernetes.io/projected/3ddb91ad-5ce3-46af-a515-0e68caddf824-kube-api-access-cqvxc\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459377 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3ddb91ad-5ce3-46af-a515-0e68caddf824-registration-dir\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459412 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-multus-certs\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459439 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/baf1611e-6531-40fa-9e20-fab0967f4575-konnectivity-ca\") pod \"konnectivity-agent-7c2jq\" (UID: \"baf1611e-6531-40fa-9e20-fab0967f4575\") " pod="kube-system/konnectivity-agent-7c2jq"
Apr 22 18:43:38.460903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459466 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj9h2\" (UniqueName: \"kubernetes.io/projected/0d277017-e970-494a-84b8-a3c2be9bbf85-kube-api-access-tj9h2\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459478 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-multus-certs\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459490 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-os-release\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459543 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqwdj\" (UniqueName: \"kubernetes.io/projected/1d694bc1-10aa-4951-95ba-8ec4878fea46-kube-api-access-tqwdj\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459577 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459687 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459693 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-os-release\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459733 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysconfig\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459758 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-log-socket\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459785 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-ovnkube-script-lib\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-netns\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459837 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-hosts-file\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459888 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cnibin\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysctl-conf\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459929 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysconfig\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6edae425-482e-4c41-a7dd-25e8579fb7cd-tmp\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.459984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-cni-netd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-system-cni-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460041 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-system-cni-dir\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/baf1611e-6531-40fa-9e20-fab0967f4575-konnectivity-ca\") pod \"konnectivity-agent-7c2jq\" (UID: \"baf1611e-6531-40fa-9e20-fab0967f4575\") " pod="kube-system/konnectivity-agent-7c2jq"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460068 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-ovn\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460099 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-ovnkube-config\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460118 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-hosts-file\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460120 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-log-socket\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d694bc1-10aa-4951-95ba-8ec4878fea46-host\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1d694bc1-10aa-4951-95ba-8ec4878fea46-serviceca\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysctl-d\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460260 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-systemd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460287 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460287 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-system-cni-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460355 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-cni-binary-copy\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460380 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-socket-dir-parent\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460407 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-k8s-cni-cncf-io\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460431 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-host\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f57kv\" (UniqueName: \"kubernetes.io/projected/6edae425-482e-4c41-a7dd-25e8579fb7cd-kube-api-access-f57kv\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.462607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460485 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460509 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-cnibin\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-lib-modules\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460558 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-ovn\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460571 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-cni-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460630 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-netns\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-openvswitch\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460737 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-cnibin\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysctl-conf\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460776 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-cni-netd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460802 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-cni-dir\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460810 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d694bc1-10aa-4951-95ba-8ec4878fea46-host\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.460976 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-host-run-ovn-kubernetes\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461079 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1a4a7ddf-880f-4094-a270-f8e491751768-run-systemd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-ovnkube-script-lib\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461136 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-sysctl-d\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461150 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-host-run-k8s-cni-cncf-io\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/baf1611e-6531-40fa-9e20-fab0967f4575-agent-certs\") pod \"konnectivity-agent-7c2jq\" (UID: \"baf1611e-6531-40fa-9e20-fab0967f4575\") " pod="kube-system/konnectivity-agent-7c2jq"
Apr 22 18:43:38.463364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461186 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-multus-socket-dir-parent\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5"
Apr 22 18:43:38.464198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1d694bc1-10aa-4951-95ba-8ec4878fea46-serviceca\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs"
Apr 22 18:43:38.464198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461232 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-host\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9"
Apr 22 18:43:38.464198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461295 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-cni-binary-copy\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " 
pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.464198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6edae425-482e-4c41-a7dd-25e8579fb7cd-lib-modules\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.464198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461294 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cni-binary-copy\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.464198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461364 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:43:38.464198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.461436 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1a4a7ddf-880f-4094-a270-f8e491751768-ovnkube-config\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.464682 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.464618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6edae425-482e-4c41-a7dd-25e8579fb7cd-tmp\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.464682 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.464666 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6edae425-482e-4c41-a7dd-25e8579fb7cd-etc-tuned\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.464857 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.464752 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1a4a7ddf-880f-4094-a270-f8e491751768-ovn-node-metrics-cert\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.464857 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.464775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/baf1611e-6531-40fa-9e20-fab0967f4575-agent-certs\") pod \"konnectivity-agent-7c2jq\" (UID: 
\"baf1611e-6531-40fa-9e20-fab0967f4575\") " pod="kube-system/konnectivity-agent-7c2jq" Apr 22 18:43:38.475702 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.475668 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:38.475702 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.475701 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:38.475926 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.475733 2569 projected.go:194] Error preparing data for projected volume kube-api-access-42b7j for pod openshift-network-diagnostics/network-check-target-k7k5d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:38.475926 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.475812 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j podName:4b1f77d8-6554-4127-b584-1a6f4269f70e nodeName:}" failed. No retries permitted until 2026-04-22 18:43:38.975792641 +0000 UTC m=+3.113137050 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-42b7j" (UniqueName: "kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j") pod "network-check-target-k7k5d" (UID: "4b1f77d8-6554-4127-b584-1a6f4269f70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:38.478249 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.478216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9rhp\" (UniqueName: \"kubernetes.io/projected/8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97-kube-api-access-l9rhp\") pod \"multus-w4nw5\" (UID: \"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97\") " pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.478403 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.478225 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc5wx\" (UniqueName: \"kubernetes.io/projected/f3cf8a0e-7d54-44c9-952a-7acd6af68bc4-kube-api-access-tc5wx\") pod \"node-resolver-2dk4j\" (UID: \"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4\") " pod="openshift-dns/node-resolver-2dk4j" Apr 22 18:43:38.478793 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.478767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvxc\" (UniqueName: \"kubernetes.io/projected/3ddb91ad-5ce3-46af-a515-0e68caddf824-kube-api-access-cqvxc\") pod \"aws-ebs-csi-driver-node-dtngj\" (UID: \"3ddb91ad-5ce3-46af-a515-0e68caddf824\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" Apr 22 18:43:38.479271 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.479236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kz44\" (UniqueName: \"kubernetes.io/projected/fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e-kube-api-access-4kz44\") pod \"iptables-alerter-lr9rd\" (UID: \"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e\") " 
pod="openshift-network-operator/iptables-alerter-lr9rd" Apr 22 18:43:38.480276 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.480247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqwdj\" (UniqueName: \"kubernetes.io/projected/1d694bc1-10aa-4951-95ba-8ec4878fea46-kube-api-access-tqwdj\") pod \"node-ca-vckqs\" (UID: \"1d694bc1-10aa-4951-95ba-8ec4878fea46\") " pod="openshift-image-registry/node-ca-vckqs" Apr 22 18:43:38.480728 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.480684 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jztd\" (UniqueName: \"kubernetes.io/projected/1a4a7ddf-880f-4094-a270-f8e491751768-kube-api-access-2jztd\") pod \"ovnkube-node-zlsv5\" (UID: \"1a4a7ddf-880f-4094-a270-f8e491751768\") " pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.481390 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.481364 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj9h2\" (UniqueName: \"kubernetes.io/projected/0d277017-e970-494a-84b8-a3c2be9bbf85-kube-api-access-tj9h2\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:43:38.481764 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.481692 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f57kv\" (UniqueName: \"kubernetes.io/projected/6edae425-482e-4c41-a7dd-25e8579fb7cd-kube-api-access-f57kv\") pod \"tuned-n2nr9\" (UID: \"6edae425-482e-4c41-a7dd-25e8579fb7cd\") " pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.562488 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cnibin\") pod \"multus-additional-cni-plugins-27fs7\" 
(UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.562686 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562501 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-system-cni-dir\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.562686 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562536 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cni-binary-copy\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.562686 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cnibin\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.562686 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.562686 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-os-release\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.562686 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562601 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-system-cni-dir\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.562686 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562676 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgnp7\" (UniqueName: \"kubernetes.io/projected/100cdeaa-300c-4323-8dd0-f6d9ca2702af-kube-api-access-jgnp7\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.563050 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562733 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.563050 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562756 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-os-release\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.563050 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:43:38.562762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.563050 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.562769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/100cdeaa-300c-4323-8dd0-f6d9ca2702af-tuning-conf-dir\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.563220 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.563148 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cni-binary-copy\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.563261 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.563213 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.563338 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.563305 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/100cdeaa-300c-4323-8dd0-f6d9ca2702af-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.571608 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.571542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgnp7\" (UniqueName: \"kubernetes.io/projected/100cdeaa-300c-4323-8dd0-f6d9ca2702af-kube-api-access-jgnp7\") pod \"multus-additional-cni-plugins-27fs7\" (UID: \"100cdeaa-300c-4323-8dd0-f6d9ca2702af\") " pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.582136 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.582108 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:38.646549 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.646515 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w4nw5" Apr 22 18:43:38.655423 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.655392 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lr9rd" Apr 22 18:43:38.663215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.663181 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7c2jq" Apr 22 18:43:38.669068 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.669037 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" Apr 22 18:43:38.674903 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.674868 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vckqs" Apr 22 18:43:38.681873 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.681842 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:43:38.689603 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.689562 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" Apr 22 18:43:38.696322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.696293 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2dk4j" Apr 22 18:43:38.702070 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.702047 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-27fs7" Apr 22 18:43:38.786758 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.786724 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:38.966227 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:38.966150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:43:38.966371 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.966316 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:38.966418 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:38.966401 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs 
podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:39.966379459 +0000 UTC m=+4.103723866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:39.006408 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.006370 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a4a7ddf_880f_4094_a270_f8e491751768.slice/crio-10d5561ce13b4da8529cb1c4d979cab411790c190a15ba49a8c789a05ed71eca WatchSource:0}: Error finding container 10d5561ce13b4da8529cb1c4d979cab411790c190a15ba49a8c789a05ed71eca: Status 404 returned error can't find the container with id 10d5561ce13b4da8529cb1c4d979cab411790c190a15ba49a8c789a05ed71eca Apr 22 18:43:39.007733 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.007675 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd96ad2b_c9bd_4b63_9e8b_e619025c1e1e.slice/crio-23c0ddcf1282cf083df8a3d50621a6cd30ba42fa30f902f29d9892c6871c7071 WatchSource:0}: Error finding container 23c0ddcf1282cf083df8a3d50621a6cd30ba42fa30f902f29d9892c6871c7071: Status 404 returned error can't find the container with id 23c0ddcf1282cf083df8a3d50621a6cd30ba42fa30f902f29d9892c6871c7071 Apr 22 18:43:39.008727 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.008691 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ddb91ad_5ce3_46af_a515_0e68caddf824.slice/crio-74f18fe8d2c02f5a680e1ea2ae1eeec64d81585fa2fc5ba4c418553d70460b39 WatchSource:0}: Error finding container 
74f18fe8d2c02f5a680e1ea2ae1eeec64d81585fa2fc5ba4c418553d70460b39: Status 404 returned error can't find the container with id 74f18fe8d2c02f5a680e1ea2ae1eeec64d81585fa2fc5ba4c418553d70460b39 Apr 22 18:43:39.009455 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.009331 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod100cdeaa_300c_4323_8dd0_f6d9ca2702af.slice/crio-ad4b90275b78f23cba3e8b4c9c965a5e2ed12b1ca628390e29bdc6b493fb51b8 WatchSource:0}: Error finding container ad4b90275b78f23cba3e8b4c9c965a5e2ed12b1ca628390e29bdc6b493fb51b8: Status 404 returned error can't find the container with id ad4b90275b78f23cba3e8b4c9c965a5e2ed12b1ca628390e29bdc6b493fb51b8 Apr 22 18:43:39.010538 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.010448 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cd178ba_71e9_476e_b2e8_e4a4f6ad3f97.slice/crio-6b7fe83da19ab35f6262723fad7f5c76c59951674d320fa01c68b602c1c0678e WatchSource:0}: Error finding container 6b7fe83da19ab35f6262723fad7f5c76c59951674d320fa01c68b602c1c0678e: Status 404 returned error can't find the container with id 6b7fe83da19ab35f6262723fad7f5c76c59951674d320fa01c68b602c1c0678e Apr 22 18:43:39.015501 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.015376 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d694bc1_10aa_4951_95ba_8ec4878fea46.slice/crio-c76b9d76ea9be17833983d119244348f694e23111c24e58ed35f37aba2c66730 WatchSource:0}: Error finding container c76b9d76ea9be17833983d119244348f694e23111c24e58ed35f37aba2c66730: Status 404 returned error can't find the container with id c76b9d76ea9be17833983d119244348f694e23111c24e58ed35f37aba2c66730 Apr 22 18:43:39.015654 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.015633 2569 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaf1611e_6531_40fa_9e20_fab0967f4575.slice/crio-f98fd813a3add47ee0a2f38226581299bf9081c3fa0a61381273a597d512723f WatchSource:0}: Error finding container f98fd813a3add47ee0a2f38226581299bf9081c3fa0a61381273a597d512723f: Status 404 returned error can't find the container with id f98fd813a3add47ee0a2f38226581299bf9081c3fa0a61381273a597d512723f Apr 22 18:43:39.017365 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.017332 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6edae425_482e_4c41_a7dd_25e8579fb7cd.slice/crio-91cb3a9307c73531af19aae06b9a8cfc0db3b719f0a397abcc415ed7a54a8af5 WatchSource:0}: Error finding container 91cb3a9307c73531af19aae06b9a8cfc0db3b719f0a397abcc415ed7a54a8af5: Status 404 returned error can't find the container with id 91cb3a9307c73531af19aae06b9a8cfc0db3b719f0a397abcc415ed7a54a8af5 Apr 22 18:43:39.020146 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:43:39.020119 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3cf8a0e_7d54_44c9_952a_7acd6af68bc4.slice/crio-9e5fde3489ca56cee3f171e206e5e2a2cbfab00cfd02d9015043f8019a661287 WatchSource:0}: Error finding container 9e5fde3489ca56cee3f171e206e5e2a2cbfab00cfd02d9015043f8019a661287: Status 404 returned error can't find the container with id 9e5fde3489ca56cee3f171e206e5e2a2cbfab00cfd02d9015043f8019a661287 Apr 22 18:43:39.067220 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.067048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 
18:43:39.067220 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:39.067207 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:39.067220 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:39.067228 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:39.067461 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:39.067240 2569 projected.go:194] Error preparing data for projected volume kube-api-access-42b7j for pod openshift-network-diagnostics/network-check-target-k7k5d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:39.067461 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:39.067305 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j podName:4b1f77d8-6554-4127-b584-1a6f4269f70e nodeName:}" failed. No retries permitted until 2026-04-22 18:43:40.067285323 +0000 UTC m=+4.204629713 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-42b7j" (UniqueName: "kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j") pod "network-check-target-k7k5d" (UID: "4b1f77d8-6554-4127-b584-1a6f4269f70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:39.430270 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.430194 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:38:37 +0000 UTC" deadline="2027-12-02 10:13:34.799437282 +0000 UTC"
Apr 22 18:43:39.430270 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.430238 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14127h29m55.369203647s"
Apr 22 18:43:39.448642 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.448606 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:39.448825 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:39.448753 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:39.463308 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.463265 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" event={"ID":"6edae425-482e-4c41-a7dd-25e8579fb7cd","Type":"ContainerStarted","Data":"91cb3a9307c73531af19aae06b9a8cfc0db3b719f0a397abcc415ed7a54a8af5"}
Apr 22 18:43:39.473044 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.472997 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2dk4j" event={"ID":"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4","Type":"ContainerStarted","Data":"9e5fde3489ca56cee3f171e206e5e2a2cbfab00cfd02d9015043f8019a661287"}
Apr 22 18:43:39.476333 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.476300 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vckqs" event={"ID":"1d694bc1-10aa-4951-95ba-8ec4878fea46","Type":"ContainerStarted","Data":"c76b9d76ea9be17833983d119244348f694e23111c24e58ed35f37aba2c66730"}
Apr 22 18:43:39.484250 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.484211 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w4nw5" event={"ID":"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97","Type":"ContainerStarted","Data":"6b7fe83da19ab35f6262723fad7f5c76c59951674d320fa01c68b602c1c0678e"}
Apr 22 18:43:39.488820 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.488524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"10d5561ce13b4da8529cb1c4d979cab411790c190a15ba49a8c789a05ed71eca"}
Apr 22 18:43:39.490858 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.490829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7c2jq" event={"ID":"baf1611e-6531-40fa-9e20-fab0967f4575","Type":"ContainerStarted","Data":"f98fd813a3add47ee0a2f38226581299bf9081c3fa0a61381273a597d512723f"}
Apr 22 18:43:39.492214 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.492188 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" event={"ID":"3ddb91ad-5ce3-46af-a515-0e68caddf824","Type":"ContainerStarted","Data":"74f18fe8d2c02f5a680e1ea2ae1eeec64d81585fa2fc5ba4c418553d70460b39"}
Apr 22 18:43:39.493679 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.493654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lr9rd" event={"ID":"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e","Type":"ContainerStarted","Data":"23c0ddcf1282cf083df8a3d50621a6cd30ba42fa30f902f29d9892c6871c7071"}
Apr 22 18:43:39.499152 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.499118 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerStarted","Data":"ad4b90275b78f23cba3e8b4c9c965a5e2ed12b1ca628390e29bdc6b493fb51b8"}
Apr 22 18:43:39.504817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.504784 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal" event={"ID":"d715d14c3e64c4935a99ec8a0e8eebfa","Type":"ContainerStarted","Data":"60523ac4370df8a17ae5be67113a75509c1f35107b555bb4bf5a688aed55cdcb"}
Apr 22 18:43:39.525264 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.525200 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-198.ec2.internal" podStartSLOduration=2.5251796300000002 podStartE2EDuration="2.52517963s" podCreationTimestamp="2026-04-22 18:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:43:39.524880867 +0000 UTC m=+3.662225293" watchObservedRunningTime="2026-04-22 18:43:39.52517963 +0000 UTC m=+3.662524048"
Apr 22 18:43:39.980265 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:39.980108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:39.980265 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:39.980258 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:39.980474 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:39.980325 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:41.980306558 +0000 UTC m=+6.117650963 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:40.081169 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:40.081127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:40.081348 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:40.081331 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:43:40.081427 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:40.081352 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:43:40.081427 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:40.081365 2569 projected.go:194] Error preparing data for projected volume kube-api-access-42b7j for pod openshift-network-diagnostics/network-check-target-k7k5d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:40.081527 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:40.081435 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j podName:4b1f77d8-6554-4127-b584-1a6f4269f70e nodeName:}" failed. No retries permitted until 2026-04-22 18:43:42.081412876 +0000 UTC m=+6.218757268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-42b7j" (UniqueName: "kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j") pod "network-check-target-k7k5d" (UID: "4b1f77d8-6554-4127-b584-1a6f4269f70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:40.449388 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:40.448805 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:40.449388 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:40.448949 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:40.519762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:40.519526 2569 generic.go:358] "Generic (PLEG): container finished" podID="689f34861518d12cf2af6cfcea46e98b" containerID="f3d177577ebc31873fd023284e62bbb35e502733363511246278b945724aef21" exitCode=0
Apr 22 18:43:40.520188 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:40.520112 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal" event={"ID":"689f34861518d12cf2af6cfcea46e98b","Type":"ContainerDied","Data":"f3d177577ebc31873fd023284e62bbb35e502733363511246278b945724aef21"}
Apr 22 18:43:41.448219 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:41.448183 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:41.448422 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:41.448311 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:41.527739 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:41.527669 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal" event={"ID":"689f34861518d12cf2af6cfcea46e98b","Type":"ContainerStarted","Data":"ee1b17bb284c00e0dcf3f739e4181506fdd2e4f03e00ccc53522edb8fb9451ed"}
Apr 22 18:43:41.996834 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:41.996775 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:41.997032 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:41.996941 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:41.997032 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:41.997029 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:45.99701024 +0000 UTC m=+10.134354634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:42.098022 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:42.097981 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:42.098208 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:42.098176 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:43:42.098208 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:42.098198 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:43:42.098312 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:42.098209 2569 projected.go:194] Error preparing data for projected volume kube-api-access-42b7j for pod openshift-network-diagnostics/network-check-target-k7k5d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:42.098312 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:42.098269 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j podName:4b1f77d8-6554-4127-b584-1a6f4269f70e nodeName:}" failed. No retries permitted until 2026-04-22 18:43:46.098250597 +0000 UTC m=+10.235595003 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-42b7j" (UniqueName: "kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j") pod "network-check-target-k7k5d" (UID: "4b1f77d8-6554-4127-b584-1a6f4269f70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:42.451457 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:42.450913 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:42.451457 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:42.451070 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:43.448838 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:43.448800 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:43.449303 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:43.448939 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:44.448729 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:44.448684 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:44.448906 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:44.448855 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:45.448380 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:45.448340 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:45.448562 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:45.448488 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:46.032943 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:46.032898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:46.033511 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:46.033073 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:46.033511 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:46.033162 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:54.033138798 +0000 UTC m=+18.170483191 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:46.134018 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:46.133977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:46.134201 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:46.134171 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:43:46.134201 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:46.134196 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:43:46.134323 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:46.134209 2569 projected.go:194] Error preparing data for projected volume kube-api-access-42b7j for pod openshift-network-diagnostics/network-check-target-k7k5d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:46.134371 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:46.134270 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j podName:4b1f77d8-6554-4127-b584-1a6f4269f70e nodeName:}" failed. No retries permitted until 2026-04-22 18:43:54.134250609 +0000 UTC m=+18.271595002 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-42b7j" (UniqueName: "kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j") pod "network-check-target-k7k5d" (UID: "4b1f77d8-6554-4127-b584-1a6f4269f70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:46.449609 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:46.449072 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:46.449609 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:46.449197 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:47.448935 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:47.448894 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:47.449398 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:47.449028 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:48.448599 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:48.448566 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:48.448816 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:48.448705 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:49.448983 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:49.448943 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:49.449374 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:49.449057 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:50.449163 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:50.449120 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:50.449584 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:50.449272 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:51.448750 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:51.448696 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:51.448943 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:51.448843 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:52.448732 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:52.448675 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:52.449283 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:52.448839 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:53.449171 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:53.449131 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:53.449790 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:53.449248 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:54.092900 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:54.092845 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:54.093113 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:54.093032 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:54.093183 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:54.093115 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:10.093094219 +0000 UTC m=+34.230438623 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:54.193557 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:54.193519 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:54.193774 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:54.193675 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:43:54.193774 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:54.193698 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:43:54.193774 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:54.193727 2569 projected.go:194] Error preparing data for projected volume kube-api-access-42b7j for pod openshift-network-diagnostics/network-check-target-k7k5d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:54.193930 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:54.193797 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j podName:4b1f77d8-6554-4127-b584-1a6f4269f70e nodeName:}" failed. No retries permitted until 2026-04-22 18:44:10.193775143 +0000 UTC m=+34.331119558 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-42b7j" (UniqueName: "kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j") pod "network-check-target-k7k5d" (UID: "4b1f77d8-6554-4127-b584-1a6f4269f70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:54.448747 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:54.448646 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:54.448915 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:54.448797 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:55.448887 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:55.448849 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:55.449355 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:55.448987 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e"
Apr 22 18:43:56.448884 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.448699 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:43:56.449053 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:56.448953 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85"
Apr 22 18:43:56.559237 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.559207 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" event={"ID":"6edae425-482e-4c41-a7dd-25e8579fb7cd","Type":"ContainerStarted","Data":"ad18bb8bf1b87e799d91295f328835f8b788944804124e84ed9d8b8eb93aef02"}
Apr 22 18:43:56.560458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.560429 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2dk4j" event={"ID":"f3cf8a0e-7d54-44c9-952a-7acd6af68bc4","Type":"ContainerStarted","Data":"807d08daab45918d430af76f693c48317ce6820ae4fa414fabad0a977abad4da"}
Apr 22 18:43:56.561502 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.561481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vckqs" event={"ID":"1d694bc1-10aa-4951-95ba-8ec4878fea46","Type":"ContainerStarted","Data":"bc1470df076fbae54d1f5ab62734f972151b9a4ee2ed47fad4b191aea9115eca"}
Apr 22 18:43:56.562631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.562612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w4nw5" event={"ID":"8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97","Type":"ContainerStarted","Data":"2802c48d4841f9375a3665f408c2810f4aef347d169bd59cdce0819b6d38e662"}
Apr 22 18:43:56.563895 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.563870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"0de705d717a1eb2645b3bed3fbbf0ba1a2bea57501993b4da52f989355673624"}
Apr 22 18:43:56.564980 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.564951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7c2jq" event={"ID":"baf1611e-6531-40fa-9e20-fab0967f4575","Type":"ContainerStarted","Data":"01f304205d0f54519f514c0c728b8be75b427fa1547d60a5e0262256c8b2fe1a"}
Apr 22 18:43:56.566003 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.565982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" event={"ID":"3ddb91ad-5ce3-46af-a515-0e68caddf824","Type":"ContainerStarted","Data":"ad39301da22731db536850e2b60680e8ac4d236f0c9010b6d8188bdf8b7021ac"}
Apr 22 18:43:56.566970 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.566953 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerStarted","Data":"8c0e12a1b0f6a5a9751e66506c5cff9065f1e9243c56dad5c9cb709dd4c1791e"}
Apr 22 18:43:56.579150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.579102 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-n2nr9" podStartSLOduration=3.48898443 podStartE2EDuration="20.579090151s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.020733188 +0000 UTC m=+3.158077597" lastFinishedPulling="2026-04-22 18:43:56.110838914 +0000 UTC m=+20.248183318" observedRunningTime="2026-04-22 18:43:56.578835295 +0000 UTC m=+20.716179708" watchObservedRunningTime="2026-04-22 18:43:56.579090151 +0000 UTC m=+20.716434565"
Apr 22 18:43:56.579422 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.579403 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-198.ec2.internal" podStartSLOduration=19.579398544 podStartE2EDuration="19.579398544s" podCreationTimestamp="2026-04-22 18:43:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:43:41.548241858 +0000 UTC m=+5.685586274" watchObservedRunningTime="2026-04-22 18:43:56.579398544 +0000 UTC m=+20.716742988"
Apr 22 18:43:56.599348 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.599304 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2dk4j" podStartSLOduration=3.509099042 podStartE2EDuration="20.599289646s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.023283845 +0000 UTC m=+3.160628241" lastFinishedPulling="2026-04-22 18:43:56.11347445 +0000 UTC m=+20.250818845" observedRunningTime="2026-04-22 18:43:56.598835321 +0000 UTC m=+20.736179735" watchObservedRunningTime="2026-04-22 18:43:56.599289646 +0000 UTC m=+20.736634058"
Apr 22 18:43:56.614482 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.614437 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7c2jq" podStartSLOduration=3.544308991 podStartE2EDuration="20.614422309s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.017399831 +0000 UTC m=+3.154744237" lastFinishedPulling="2026-04-22 18:43:56.087513152 +0000 UTC m=+20.224857555" observedRunningTime="2026-04-22 18:43:56.614018468 +0000 UTC m=+20.751362892" watchObservedRunningTime="2026-04-22 18:43:56.614422309 +0000 UTC m=+20.751766721"
Apr 22 18:43:56.656506 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.656441 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w4nw5" podStartSLOduration=3.543335418 podStartE2EDuration="20.656420916s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.013654428 +0000 UTC m=+3.150998823" lastFinishedPulling="2026-04-22 18:43:56.12673993 +0000 UTC m=+20.264084321" observedRunningTime="2026-04-22 18:43:56.655212108 +0000 UTC m=+20.792556534" watchObservedRunningTime="2026-04-22 18:43:56.656420916 +0000 UTC m=+20.793765334"
Apr 22 18:43:56.671169 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:56.671109 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vckqs" podStartSLOduration=3.6004651069999998 podStartE2EDuration="20.671087897s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.016891593 +0000 UTC m=+3.154235989" lastFinishedPulling="2026-04-22 18:43:56.087514384 +0000 UTC m=+20.224858779" observedRunningTime="2026-04-22 18:43:56.670533908 +0000 UTC m=+20.807878321" watchObservedRunningTime="2026-04-22 18:43:56.671087897 +0000 UTC m=+20.808432313"
Apr 22 18:43:57.448934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.448761 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:43:57.449076 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:57.449051 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e" Apr 22 18:43:57.573249 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.573222 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:43:57.573569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.573544 2569 generic.go:358] "Generic (PLEG): container finished" podID="1a4a7ddf-880f-4094-a270-f8e491751768" containerID="841195bc08ef25438b846f53b4ea66b345f95aac4da0fa8fae88bb5f50599b0b" exitCode=1 Apr 22 18:43:57.573675 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.573626 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"c93cb5e3fca3a2fb723dc95bee0c3ec3469f07aed87207762091f965fa87960a"} Apr 22 18:43:57.573675 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.573660 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"a3aa76f1abdb651d35cd5bd26398bbfbae02aa9c32e683236d40eb76431b0311"} Apr 22 18:43:57.573675 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.573672 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"d0ee618f82b926f30b10fd5af04848b26f4eee2a08b38f6a40804b046be87b80"} Apr 22 18:43:57.573812 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.573685 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"b502cb5a425f99d876797b85a8e65651b78c36d6216ccd4e806ccec61e6fc74b"} Apr 22 
18:43:57.573812 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.573698 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerDied","Data":"841195bc08ef25438b846f53b4ea66b345f95aac4da0fa8fae88bb5f50599b0b"} Apr 22 18:43:57.575003 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.574976 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lr9rd" event={"ID":"fd96ad2b-c9bd-4b63-9e8b-e619025c1e1e","Type":"ContainerStarted","Data":"d681bd8556a6e3d970cb3525791ab3f419e7e9555da2f3695d6fb7da2f0e5fa8"} Apr 22 18:43:57.576181 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.576157 2569 generic.go:358] "Generic (PLEG): container finished" podID="100cdeaa-300c-4323-8dd0-f6d9ca2702af" containerID="8c0e12a1b0f6a5a9751e66506c5cff9065f1e9243c56dad5c9cb709dd4c1791e" exitCode=0 Apr 22 18:43:57.576332 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.576298 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerDied","Data":"8c0e12a1b0f6a5a9751e66506c5cff9065f1e9243c56dad5c9cb709dd4c1791e"} Apr 22 18:43:57.620445 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.620353 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lr9rd" podStartSLOduration=4.519265215 podStartE2EDuration="21.620336771s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.009770577 +0000 UTC m=+3.147114981" lastFinishedPulling="2026-04-22 18:43:56.110842132 +0000 UTC m=+20.248186537" observedRunningTime="2026-04-22 18:43:57.592586182 +0000 UTC m=+21.729930595" watchObservedRunningTime="2026-04-22 18:43:57.620336771 +0000 UTC m=+21.757681185" Apr 22 18:43:57.715654 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:43:57.715628 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:43:57.804672 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.804639 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7c2jq" Apr 22 18:43:57.805269 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:57.805251 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7c2jq" Apr 22 18:43:58.420927 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:58.420737 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:43:57.715647259Z","UUID":"94ddd49d-9a6e-4743-9ea2-20fcb5967f85","Handler":null,"Name":"","Endpoint":""} Apr 22 18:43:58.424652 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:58.424623 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:43:58.424652 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:58.424660 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:43:58.452111 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:58.451959 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:43:58.452111 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:58.452094 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85" Apr 22 18:43:58.581035 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:58.580995 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" event={"ID":"3ddb91ad-5ce3-46af-a515-0e68caddf824","Type":"ContainerStarted","Data":"819ef90179e61150f60711f1442a0dfdfcb79361830f2e2341729199478eb168"} Apr 22 18:43:59.448857 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:59.448654 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 18:43:59.449014 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:43:59.448940 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e" Apr 22 18:43:59.585672 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:59.585633 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" event={"ID":"3ddb91ad-5ce3-46af-a515-0e68caddf824","Type":"ContainerStarted","Data":"9dac95300d72ff8bde86840d3d11acd67f32cca00713f28814bb55b95d37ce32"} Apr 22 18:43:59.590122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:59.590098 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:43:59.590615 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:59.590594 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:43:59.590760 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:59.590635 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"1b094cd493b9d482a16b2504d4221f2211a380ec35ca3154e8f6240488658d98"} Apr 22 18:43:59.610213 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:43:59.610147 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dtngj" podStartSLOduration=3.704700269 podStartE2EDuration="23.610127381s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.011872147 +0000 UTC m=+3.149216541" lastFinishedPulling="2026-04-22 18:43:58.917299258 +0000 UTC m=+23.054643653" observedRunningTime="2026-04-22 18:43:59.609654952 +0000 UTC m=+23.746999365" watchObservedRunningTime="2026-04-22 18:43:59.610127381 +0000 UTC m=+23.747471806" Apr 22 18:44:00.451927 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:00.451893 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:44:00.452169 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:00.452022 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85" Apr 22 18:44:00.873913 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:00.873876 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7c2jq" Apr 22 18:44:00.874488 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:00.874004 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:44:00.874560 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:00.874526 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7c2jq" Apr 22 18:44:01.448215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:01.448178 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 18:44:01.448403 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:01.448315 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e" Apr 22 18:44:02.451956 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.451776 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:44:02.452563 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:02.452034 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85" Apr 22 18:44:02.598015 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.597979 2569 generic.go:358] "Generic (PLEG): container finished" podID="100cdeaa-300c-4323-8dd0-f6d9ca2702af" containerID="27bdcc665e6d844168cc06818fe3ad47a976de5b0b3931e35920608bccec7d2a" exitCode=0 Apr 22 18:44:02.598194 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.598069 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerDied","Data":"27bdcc665e6d844168cc06818fe3ad47a976de5b0b3931e35920608bccec7d2a"} Apr 22 18:44:02.601176 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.601157 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:44:02.601489 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.601457 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"acd4af9c98ed2724988e60e37deb779ddf72d4495c30019b702b749e27ae7c50"} Apr 22 18:44:02.601820 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.601802 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:44:02.601875 
ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.601834 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:44:02.601954 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.601937 2569 scope.go:117] "RemoveContainer" containerID="841195bc08ef25438b846f53b4ea66b345f95aac4da0fa8fae88bb5f50599b0b" Apr 22 18:44:02.617122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:02.617101 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:44:03.448179 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.448104 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 18:44:03.448298 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:03.448205 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e" Apr 22 18:44:03.605893 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.605864 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:44:03.606322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.606216 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" event={"ID":"1a4a7ddf-880f-4094-a270-f8e491751768","Type":"ContainerStarted","Data":"c27dd082d9b2dfa2f43737cf0c875bbd345d0dff9f760d08a1f757bbdb789963"} Apr 22 18:44:03.606464 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.606447 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:44:03.608298 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.608271 2569 generic.go:358] "Generic (PLEG): container finished" podID="100cdeaa-300c-4323-8dd0-f6d9ca2702af" containerID="cb3af9caa32eb6d4b37cb0a204e112dbc04bd4c471fa925e28cb3af4d61269a3" exitCode=0 Apr 22 18:44:03.608417 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.608302 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerDied","Data":"cb3af9caa32eb6d4b37cb0a204e112dbc04bd4c471fa925e28cb3af4d61269a3"} Apr 22 18:44:03.625957 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.625920 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" Apr 22 18:44:03.630581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.630547 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k7k5d"] Apr 22 18:44:03.630736 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:44:03.630678 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 18:44:03.630812 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:03.630788 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e" Apr 22 18:44:03.632856 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.632831 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-srr7v"] Apr 22 18:44:03.632973 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.632944 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:44:03.633065 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:03.633044 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85" Apr 22 18:44:03.639465 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:03.639415 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5" podStartSLOduration=10.193805562 podStartE2EDuration="27.63940163s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.008452122 +0000 UTC m=+3.145796532" lastFinishedPulling="2026-04-22 18:43:56.454048192 +0000 UTC m=+20.591392600" observedRunningTime="2026-04-22 18:44:03.63915705 +0000 UTC m=+27.776501465" watchObservedRunningTime="2026-04-22 18:44:03.63940163 +0000 UTC m=+27.776746027" Apr 22 18:44:04.612449 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:04.612267 2569 generic.go:358] "Generic (PLEG): container finished" podID="100cdeaa-300c-4323-8dd0-f6d9ca2702af" containerID="04249d671739a60f3df3c1f7292502cf0a54104140fd9200537de6f0f3d4d250" exitCode=0 Apr 22 18:44:04.612897 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:04.612355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerDied","Data":"04249d671739a60f3df3c1f7292502cf0a54104140fd9200537de6f0f3d4d250"} Apr 22 18:44:05.448390 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:05.448353 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:44:05.448610 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:05.448409 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 18:44:05.448610 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:05.448525 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85" Apr 22 18:44:05.448779 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:05.448671 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e" Apr 22 18:44:07.448757 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:07.448108 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d" Apr 22 18:44:07.448757 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:07.448248 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-k7k5d" podUID="4b1f77d8-6554-4127-b584-1a6f4269f70e" Apr 22 18:44:07.448757 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:07.448108 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:44:07.448757 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:07.448690 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-srr7v" podUID="0d277017-e970-494a-84b8-a3c2be9bbf85" Apr 22 18:44:09.201703 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.201622 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-198.ec2.internal" event="NodeReady" Apr 22 18:44:09.202272 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.201796 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:44:09.262972 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.262929 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8xrxz"] Apr 22 18:44:09.265550 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.265528 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mcc6f"] Apr 22 18:44:09.265699 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.265688 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8xrxz" Apr 22 18:44:09.267523 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.267498 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mcc6f" Apr 22 18:44:09.268446 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.268386 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:44:09.269148 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.268884 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:44:09.269148 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.268964 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:44:09.269148 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.269042 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xmncv\"" Apr 22 18:44:09.269906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.269884 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xcpp5\"" Apr 22 18:44:09.270002 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.269892 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:44:09.270232 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.270132 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:44:09.274955 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.274699 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8xrxz"] Apr 22 18:44:09.279043 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.279018 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mcc6f"] Apr 22 18:44:09.402319 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.402284 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.402491 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.402357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-config-volume\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.402491 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.402386 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9pn9\" (UniqueName: \"kubernetes.io/projected/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-kube-api-access-j9pn9\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.402491 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.402436 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-tmp-dir\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.402491 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.402469 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrvg\" (UniqueName: \"kubernetes.io/projected/c49b1678-0299-4b9e-a911-9a8cf6fedf32-kube-api-access-5lrvg\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:09.402491 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.402487 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:09.448376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.448341 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:44:09.448580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.448336 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:44:09.451322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.451296 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:44:09.451322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.451303 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgrtw\""
Apr 22 18:44:09.451510 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.451376 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6clv\""
Apr 22 18:44:09.451510 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.451426 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:44:09.451823 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.451738 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:44:09.503001 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.502965 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-tmp-dir\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.503001 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.503007 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrvg\" (UniqueName: \"kubernetes.io/projected/c49b1678-0299-4b9e-a911-9a8cf6fedf32-kube-api-access-5lrvg\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.503030 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.503063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.503127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-config-volume\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:09.503144 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.503151 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9pn9\" (UniqueName: \"kubernetes.io/projected/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-kube-api-access-j9pn9\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:09.503159 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:09.503221 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:10.003201077 +0000 UTC m=+34.140545477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:44:09.503255 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:09.503240 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:44:10.003230743 +0000 UTC m=+34.140575137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:09.503652 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.503374 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-tmp-dir\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.503921 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.503898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-config-volume\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.514740 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.514688 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9pn9\" (UniqueName: \"kubernetes.io/projected/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-kube-api-access-j9pn9\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:09.514948 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:09.514926 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrvg\" (UniqueName: \"kubernetes.io/projected/c49b1678-0299-4b9e-a911-9a8cf6fedf32-kube-api-access-5lrvg\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:10.007508 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.007451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:10.007508 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.007521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:10.007895 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:10.007658 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:10.007895 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:10.007660 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:10.007895 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:10.007749 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:44:11.007706756 +0000 UTC m=+35.145051158 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:10.007895 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:10.007794 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:11.00777654 +0000 UTC m=+35.145120930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:44:10.108919 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.108878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:44:10.109113 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:10.109034 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:44:10.109113 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:10.109101 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:42.109083457 +0000 UTC m=+66.246427849 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : secret "metrics-daemon-secret" not found
Apr 22 18:44:10.209441 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.209400 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:44:10.212029 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.212006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42b7j\" (UniqueName: \"kubernetes.io/projected/4b1f77d8-6554-4127-b584-1a6f4269f70e-kube-api-access-42b7j\") pod \"network-check-target-k7k5d\" (UID: \"4b1f77d8-6554-4127-b584-1a6f4269f70e\") " pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:44:10.366331 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.366292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:44:10.587612 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.587362 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-k7k5d"]
Apr 22 18:44:10.601386 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:44:10.601342 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b1f77d8_6554_4127_b584_1a6f4269f70e.slice/crio-1fe651b15e179cf1f4f4b6692750e2da193691f8ef46c2c8cd698f89355063c1 WatchSource:0}: Error finding container 1fe651b15e179cf1f4f4b6692750e2da193691f8ef46c2c8cd698f89355063c1: Status 404 returned error can't find the container with id 1fe651b15e179cf1f4f4b6692750e2da193691f8ef46c2c8cd698f89355063c1
Apr 22 18:44:10.625934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.625813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k7k5d" event={"ID":"4b1f77d8-6554-4127-b584-1a6f4269f70e","Type":"ContainerStarted","Data":"1fe651b15e179cf1f4f4b6692750e2da193691f8ef46c2c8cd698f89355063c1"}
Apr 22 18:44:10.628512 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:10.628482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerStarted","Data":"ba47094316af9f6d2a662276b528e996ace03eefb348e77e84e905a8e2b1c0c0"}
Apr 22 18:44:11.016428 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:11.016330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:11.016428 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:11.016376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:11.016648 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:11.016492 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:11.016648 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:11.016500 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:11.016648 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:11.016556 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:44:13.016541892 +0000 UTC m=+37.153886293 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:11.016648 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:11.016572 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:13.016566508 +0000 UTC m=+37.153910899 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:44:11.633924 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:11.633887 2569 generic.go:358] "Generic (PLEG): container finished" podID="100cdeaa-300c-4323-8dd0-f6d9ca2702af" containerID="ba47094316af9f6d2a662276b528e996ace03eefb348e77e84e905a8e2b1c0c0" exitCode=0
Apr 22 18:44:11.634448 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:11.633933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerDied","Data":"ba47094316af9f6d2a662276b528e996ace03eefb348e77e84e905a8e2b1c0c0"}
Apr 22 18:44:12.640272 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:12.640233 2569 generic.go:358] "Generic (PLEG): container finished" podID="100cdeaa-300c-4323-8dd0-f6d9ca2702af" containerID="ef2e7ee148456292c853650b545450439cad19348411e86342452cb4cca5d34b" exitCode=0
Apr 22 18:44:12.640802 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:12.640320 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerDied","Data":"ef2e7ee148456292c853650b545450439cad19348411e86342452cb4cca5d34b"}
Apr 22 18:44:13.033132 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:13.033034 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:13.033132 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:13.033090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:13.033336 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:13.033205 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:13.033336 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:13.033278 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:44:17.033261853 +0000 UTC m=+41.170606247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:13.033336 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:13.033205 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:13.033466 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:13.033386 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:17.033364021 +0000 UTC m=+41.170708417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:44:13.649541 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:13.649337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-27fs7" event={"ID":"100cdeaa-300c-4323-8dd0-f6d9ca2702af","Type":"ContainerStarted","Data":"b178bc3460c23b28373900debba5c16d4343cb8cc1c4512828ff3123a7f318a1"}
Apr 22 18:44:13.674965 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:13.674799 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-27fs7" podStartSLOduration=6.3130704 podStartE2EDuration="37.674780491s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:43:39.012407033 +0000 UTC m=+3.149751437" lastFinishedPulling="2026-04-22 18:44:10.374117128 +0000 UTC m=+34.511461528" observedRunningTime="2026-04-22 18:44:13.674533538 +0000 UTC m=+37.811877952" watchObservedRunningTime="2026-04-22 18:44:13.674780491 +0000 UTC m=+37.812124905"
Apr 22 18:44:14.652448 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:14.652380 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-k7k5d" event={"ID":"4b1f77d8-6554-4127-b584-1a6f4269f70e","Type":"ContainerStarted","Data":"768fb52262bf47eea0d490dfb0f9686432c5c1835111e39df9a7dd58ab2525f9"}
Apr 22 18:44:14.652997 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:14.652747 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:44:14.670385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:14.670330 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-k7k5d" podStartSLOduration=35.657987729 podStartE2EDuration="38.670312469s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:44:10.603549907 +0000 UTC m=+34.740894298" lastFinishedPulling="2026-04-22 18:44:13.615874631 +0000 UTC m=+37.753219038" observedRunningTime="2026-04-22 18:44:14.669631569 +0000 UTC m=+38.806975983" watchObservedRunningTime="2026-04-22 18:44:14.670312469 +0000 UTC m=+38.807656920"
Apr 22 18:44:17.062049 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:17.062001 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:17.062511 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:17.062050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:17.062511 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:17.062158 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:17.062511 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:17.062224 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:25.062208845 +0000 UTC m=+49.199553236 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:44:17.062511 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:17.062163 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:17.062511 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:17.062292 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:44:25.062280671 +0000 UTC m=+49.199625062 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:25.120641 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:25.120604 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:25.120641 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:25.120652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:25.121165 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:25.120775 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:25.121165 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:25.120783 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:25.121165 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:25.120830 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:44:41.120816777 +0000 UTC m=+65.258161168 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:25.121165 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:25.120843 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:41.1208375 +0000 UTC m=+65.258181891 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:44:35.628626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:35.628596 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zlsv5"
Apr 22 18:44:41.126096 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:41.126039 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:44:41.126096 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:41.126106 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:44:41.126539 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:41.126187 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:41.126539 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:41.126207 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:41.126539 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:41.126259 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:45:13.126244767 +0000 UTC m=+97.263589158 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:41.126539 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:41.126272 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:13.126266546 +0000 UTC m=+97.263610937 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:44:42.131366 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:42.131324 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v"
Apr 22 18:44:42.131776 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:42.131441 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:44:42.131776 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:44:42.131497 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs podName:0d277017-e970-494a-84b8-a3c2be9bbf85 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:46.131482056 +0000 UTC m=+130.268826446 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs") pod "network-metrics-daemon-srr7v" (UID: "0d277017-e970-494a-84b8-a3c2be9bbf85") : secret "metrics-daemon-secret" not found
Apr 22 18:44:46.657775 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:44:46.657626 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-k7k5d"
Apr 22 18:45:13.157619 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:13.157558 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f"
Apr 22 18:45:13.158138 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:13.157661 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz"
Apr 22 18:45:13.158138 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:13.157743 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:45:13.158138 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:13.157767 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:45:13.158138 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:13.157827 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls podName:f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf nodeName:}" failed. No retries permitted until 2026-04-22 18:46:17.15781131 +0000 UTC m=+161.295155701 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls") pod "dns-default-mcc6f" (UID: "f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf") : secret "dns-default-metrics-tls" not found
Apr 22 18:45:13.158138 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:13.157844 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert podName:c49b1678-0299-4b9e-a911-9a8cf6fedf32 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:17.157836642 +0000 UTC m=+161.295181033 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert") pod "ingress-canary-8xrxz" (UID: "c49b1678-0299-4b9e-a911-9a8cf6fedf32") : secret "canary-serving-cert" not found
Apr 22 18:45:29.664658 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.664624 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn"]
Apr 22 18:45:29.666432 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.666408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn"
Apr 22 18:45:29.669273 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.669244 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:45:29.670387 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.670365 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 18:45:29.670507 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.670388 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-ldtmv\""
Apr 22 18:45:29.675250 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.675226 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-whxx8"]
Apr 22 18:45:29.676787 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.676769 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz"]
Apr 22 18:45:29.676915 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.676893 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8"
Apr 22 18:45:29.679477 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.679452 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz"
Apr 22 18:45:29.680567 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.680547 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 18:45:29.680985 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.680968 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 18:45:29.681066 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.680988 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 18:45:29.681130 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.681078 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:45:29.681383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.681365 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-b6c4b\""
Apr 22 18:45:29.682145 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.682124 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-87jhn\""
Apr 22 18:45:29.682254 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.682163 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:45:29.682254 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.682218 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 18:45:29.682610 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.682588 2569
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:45:29.684285 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.684264 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn"] Apr 22 18:45:29.688527 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.688502 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 18:45:29.692223 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.692197 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz"] Apr 22 18:45:29.700783 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.700754 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-whxx8"] Apr 22 18:45:29.768966 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.768893 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ed56c5-616f-4090-b07c-6ea41e001ffb-serving-cert\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.768966 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.768975 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ed56c5-616f-4090-b07c-6ea41e001ffb-trusted-ca\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.769203 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.769013 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqq7l\" (UniqueName: \"kubernetes.io/projected/ce444571-7bf0-4e63-9242-ba9ff25b7c8b-kube-api-access-hqq7l\") pod \"volume-data-source-validator-7c6cbb6c87-zgbjn\" (UID: \"ce444571-7bf0-4e63-9242-ba9ff25b7c8b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn" Apr 22 18:45:29.769203 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.769033 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ed56c5-616f-4090-b07c-6ea41e001ffb-config\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.769203 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.769066 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:29.769203 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.769135 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs4cv\" (UniqueName: \"kubernetes.io/projected/15b8db86-df95-44d0-9435-c1240af09808-kube-api-access-bs4cv\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:29.769326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.769213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zwxvw\" (UniqueName: \"kubernetes.io/projected/f6ed56c5-616f-4090-b07c-6ea41e001ffb-kube-api-access-zwxvw\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.774192 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.774162 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk"] Apr 22 18:45:29.776301 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.776282 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"] Apr 22 18:45:29.776454 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.776436 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk" Apr 22 18:45:29.778087 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.778067 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-674b8fcb94-fwbww"] Apr 22 18:45:29.778227 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.778200 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:29.779830 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.779810 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.780895 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.780877 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-vldjv\"" Apr 22 18:45:29.781109 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.781094 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 18:45:29.783106 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.783085 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:45:29.784466 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.784446 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 18:45:29.785859 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.785845 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-f7mrt\"" Apr 22 18:45:29.786961 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.786944 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:45:29.787694 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.787676 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:45:29.787990 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.787975 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:45:29.788046 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.787989 2569 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nnjgq\"" Apr 22 18:45:29.790972 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.790912 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:45:29.792940 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.792898 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:45:29.797172 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.797148 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk"] Apr 22 18:45:29.800933 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.800894 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-674b8fcb94-fwbww"] Apr 22 18:45:29.801899 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.801879 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"] Apr 22 18:45:29.870349 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870307 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwxvw\" (UniqueName: \"kubernetes.io/projected/f6ed56c5-616f-4090-b07c-6ea41e001ffb-kube-api-access-zwxvw\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.870349 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870352 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cba8d754-67a7-482c-b98c-dcd5f2072415-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:29.870569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870377 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-bound-sa-token\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.870569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870405 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9594788c-903f-4331-bbd8-4125bf68a19c-ca-trust-extracted\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.870569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ed56c5-616f-4090-b07c-6ea41e001ffb-serving-cert\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.870569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870443 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ed56c5-616f-4090-b07c-6ea41e001ffb-trusted-ca\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.870569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-trusted-ca\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.870569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7qmw\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-kube-api-access-r7qmw\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.870569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870532 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqq7l\" (UniqueName: \"kubernetes.io/projected/ce444571-7bf0-4e63-9242-ba9ff25b7c8b-kube-api-access-hqq7l\") pod \"volume-data-source-validator-7c6cbb6c87-zgbjn\" (UID: \"ce444571-7bf0-4e63-9242-ba9ff25b7c8b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn" Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ed56c5-616f-4090-b07c-6ea41e001ffb-config\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870694 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtgb6\" (UniqueName: \"kubernetes.io/projected/70717fa0-d526-4114-9486-7094f71b02c6-kube-api-access-mtgb6\") pod \"network-check-source-8894fc9bd-8hhzk\" (UID: 
\"70717fa0-d526-4114-9486-7094f71b02c6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk" Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870851 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs4cv\" (UniqueName: \"kubernetes.io/projected/15b8db86-df95-44d0-9435-c1240af09808-kube-api-access-bs4cv\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:29.870862 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870889 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-image-registry-private-configuration\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:29.870932 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls 
podName:15b8db86-df95-44d0-9435-c1240af09808 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:30.370908809 +0000 UTC m=+114.508253218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q7sdz" (UID: "15b8db86-df95-44d0-9435-c1240af09808") : secret "samples-operator-tls" not found Apr 22 18:45:29.870951 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870955 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:29.871352 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.870988 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfvdc\" (UniqueName: \"kubernetes.io/projected/cba8d754-67a7-482c-b98c-dcd5f2072415-kube-api-access-bfvdc\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:29.871352 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.871029 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-registry-certificates\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.871352 ip-10-0-132-198 kubenswrapper[2569]: I0422 
18:45:29.871069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-installation-pull-secrets\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.871352 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.871089 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.871515 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.871421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6ed56c5-616f-4090-b07c-6ea41e001ffb-config\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.871515 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.871421 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f6ed56c5-616f-4090-b07c-6ea41e001ffb-trusted-ca\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.874634 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.874604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6ed56c5-616f-4090-b07c-6ea41e001ffb-serving-cert\") pod \"console-operator-9d4b6777b-whxx8\" (UID: 
\"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.902726 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.902685 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwxvw\" (UniqueName: \"kubernetes.io/projected/f6ed56c5-616f-4090-b07c-6ea41e001ffb-kube-api-access-zwxvw\") pod \"console-operator-9d4b6777b-whxx8\" (UID: \"f6ed56c5-616f-4090-b07c-6ea41e001ffb\") " pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:29.908746 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.908702 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs4cv\" (UniqueName: \"kubernetes.io/projected/15b8db86-df95-44d0-9435-c1240af09808-kube-api-access-bs4cv\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:29.920300 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.920226 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"] Apr 22 18:45:29.920937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.920901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqq7l\" (UniqueName: \"kubernetes.io/projected/ce444571-7bf0-4e63-9242-ba9ff25b7c8b-kube-api-access-hqq7l\") pod \"volume-data-source-validator-7c6cbb6c87-zgbjn\" (UID: \"ce444571-7bf0-4e63-9242-ba9ff25b7c8b\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn" Apr 22 18:45:29.922992 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.922977 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" Apr 22 18:45:29.930531 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.930513 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:45:29.930772 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.930757 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:45:29.931356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.931342 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:45:29.932070 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.932051 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:45:29.933375 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.933361 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xmp74\"" Apr 22 18:45:29.966362 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.966331 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"] Apr 22 18:45:29.971786 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.971759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-image-registry-private-configuration\") pod \"image-registry-674b8fcb94-fwbww\" (UID: 
\"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:29.971925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.971795 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da28986a-53c5-4e28-9290-9ebf6cffc2ae-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" Apr 22 18:45:29.971925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.971813 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da28986a-53c5-4e28-9290-9ebf6cffc2ae-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" Apr 22 18:45:29.971925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.971836 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:29.971925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.971895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfvdc\" (UniqueName: \"kubernetes.io/projected/cba8d754-67a7-482c-b98c-dcd5f2072415-kube-api-access-bfvdc\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"
Apr 22 18:45:29.971925 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:29.971921 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:29.972175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.971941 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-registry-certificates\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.972175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.971988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-installation-pull-secrets\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.972175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972018 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.972175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972052 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9zcf\" (UniqueName: \"kubernetes.io/projected/da28986a-53c5-4e28-9290-9ebf6cffc2ae-kube-api-access-r9zcf\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:29.972175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972106 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cba8d754-67a7-482c-b98c-dcd5f2072415-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"
Apr 22 18:45:29.972175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972130 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-bound-sa-token\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.972175 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:29.972166 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls podName:cba8d754-67a7-482c-b98c-dcd5f2072415 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:30.472141252 +0000 UTC m=+114.609485650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkd4b" (UID: "cba8d754-67a7-482c-b98c-dcd5f2072415") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:29.972592 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972204 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9594788c-903f-4331-bbd8-4125bf68a19c-ca-trust-extracted\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.972592 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972239 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-trusted-ca\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.972592 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7qmw\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-kube-api-access-r7qmw\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.972592 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:29.972307 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:45:29.972592 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972315 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtgb6\" (UniqueName: \"kubernetes.io/projected/70717fa0-d526-4114-9486-7094f71b02c6-kube-api-access-mtgb6\") pod \"network-check-source-8894fc9bd-8hhzk\" (UID: \"70717fa0-d526-4114-9486-7094f71b02c6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk"
Apr 22 18:45:29.972592 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:29.972321 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674b8fcb94-fwbww: secret "image-registry-tls" not found
Apr 22 18:45:29.972592 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:29.972453 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls podName:9594788c-903f-4331-bbd8-4125bf68a19c nodeName:}" failed. No retries permitted until 2026-04-22 18:45:30.47243229 +0000 UTC m=+114.609776778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls") pod "image-registry-674b8fcb94-fwbww" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c") : secret "image-registry-tls" not found
Apr 22 18:45:29.972957 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.972625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-registry-certificates\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.973053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.973034 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/cba8d754-67a7-482c-b98c-dcd5f2072415-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"
Apr 22 18:45:29.973290 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.973271 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-trusted-ca\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.973469 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.973435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9594788c-903f-4331-bbd8-4125bf68a19c-ca-trust-extracted\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.974432 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.974413 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-image-registry-private-configuration\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.974524 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.974485 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-installation-pull-secrets\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:29.975351 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.975331 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn"
Apr 22 18:45:29.990346 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:29.990318 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8"
Apr 22 18:45:30.000230 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.000200 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfvdc\" (UniqueName: \"kubernetes.io/projected/cba8d754-67a7-482c-b98c-dcd5f2072415-kube-api-access-bfvdc\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"
Apr 22 18:45:30.003456 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.003433 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-bound-sa-token\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:30.007197 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.007166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7qmw\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-kube-api-access-r7qmw\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:30.018474 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.018443 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtgb6\" (UniqueName: \"kubernetes.io/projected/70717fa0-d526-4114-9486-7094f71b02c6-kube-api-access-mtgb6\") pod \"network-check-source-8894fc9bd-8hhzk\" (UID: \"70717fa0-d526-4114-9486-7094f71b02c6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk"
Apr 22 18:45:30.074524 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.073382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9zcf\" (UniqueName: \"kubernetes.io/projected/da28986a-53c5-4e28-9290-9ebf6cffc2ae-kube-api-access-r9zcf\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:30.074524 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.073458 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da28986a-53c5-4e28-9290-9ebf6cffc2ae-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:30.074524 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.073475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da28986a-53c5-4e28-9290-9ebf6cffc2ae-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:30.074524 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.074051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da28986a-53c5-4e28-9290-9ebf6cffc2ae-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:30.077799 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.077705 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da28986a-53c5-4e28-9290-9ebf6cffc2ae-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:30.083315 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.083259 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9zcf\" (UniqueName: \"kubernetes.io/projected/da28986a-53c5-4e28-9290-9ebf6cffc2ae-kube-api-access-r9zcf\") pod \"kube-storage-version-migrator-operator-6769c5d45-6fj2l\" (UID: \"da28986a-53c5-4e28-9290-9ebf6cffc2ae\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:30.086518 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.086494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk"
Apr 22 18:45:30.109569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.109454 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn"]
Apr 22 18:45:30.112444 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:30.112410 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce444571_7bf0_4e63_9242_ba9ff25b7c8b.slice/crio-47b96cbbe813f015213b42b8d9fd658d1067ae85420a9a230ae243a164c7043f WatchSource:0}: Error finding container 47b96cbbe813f015213b42b8d9fd658d1067ae85420a9a230ae243a164c7043f: Status 404 returned error can't find the container with id 47b96cbbe813f015213b42b8d9fd658d1067ae85420a9a230ae243a164c7043f
Apr 22 18:45:30.124628 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.124602 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-whxx8"]
Apr 22 18:45:30.126985 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:30.126948 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6ed56c5_616f_4090_b07c_6ea41e001ffb.slice/crio-7c9bfb9b4c1e0314ade4aab91ebf07b2fe0ead9cc1209eeb59fbcaf9b9201510 WatchSource:0}: Error finding container 7c9bfb9b4c1e0314ade4aab91ebf07b2fe0ead9cc1209eeb59fbcaf9b9201510: Status 404 returned error can't find the container with id 7c9bfb9b4c1e0314ade4aab91ebf07b2fe0ead9cc1209eeb59fbcaf9b9201510
Apr 22 18:45:30.209114 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.209035 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk"]
Apr 22 18:45:30.212058 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:30.212031 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70717fa0_d526_4114_9486_7094f71b02c6.slice/crio-7c2f05d406bea89c5add19483a265f5704aa30fe65460d46a26a960333babc36 WatchSource:0}: Error finding container 7c2f05d406bea89c5add19483a265f5704aa30fe65460d46a26a960333babc36: Status 404 returned error can't find the container with id 7c2f05d406bea89c5add19483a265f5704aa30fe65460d46a26a960333babc36
Apr 22 18:45:30.232877 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.232850 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"
Apr 22 18:45:30.365465 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.365432 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l"]
Apr 22 18:45:30.369837 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:30.369804 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda28986a_53c5_4e28_9290_9ebf6cffc2ae.slice/crio-42da6c7a2428f5950e7c4fa54f7687ccc0a4b946b245ebcdaf80c5e3b4f9e804 WatchSource:0}: Error finding container 42da6c7a2428f5950e7c4fa54f7687ccc0a4b946b245ebcdaf80c5e3b4f9e804: Status 404 returned error can't find the container with id 42da6c7a2428f5950e7c4fa54f7687ccc0a4b946b245ebcdaf80c5e3b4f9e804
Apr 22 18:45:30.375604 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.375579 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz"
Apr 22 18:45:30.375769 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:30.375751 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:45:30.375843 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:30.375825 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls podName:15b8db86-df95-44d0-9435-c1240af09808 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:31.375804223 +0000 UTC m=+115.513148617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q7sdz" (UID: "15b8db86-df95-44d0-9435-c1240af09808") : secret "samples-operator-tls" not found
Apr 22 18:45:30.476392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.476295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"
Apr 22 18:45:30.476392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.476349 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:30.476632 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:30.476453 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:30.476632 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:30.476474 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:45:30.476632 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:30.476485 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674b8fcb94-fwbww: secret "image-registry-tls" not found
Apr 22 18:45:30.476632 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:30.476530 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls podName:9594788c-903f-4331-bbd8-4125bf68a19c nodeName:}" failed. No retries permitted until 2026-04-22 18:45:31.47651356 +0000 UTC m=+115.613857951 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls") pod "image-registry-674b8fcb94-fwbww" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c") : secret "image-registry-tls" not found
Apr 22 18:45:30.476632 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:30.476545 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls podName:cba8d754-67a7-482c-b98c-dcd5f2072415 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:31.476536444 +0000 UTC m=+115.613880839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkd4b" (UID: "cba8d754-67a7-482c-b98c-dcd5f2072415") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:30.803164 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.793444 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn" event={"ID":"ce444571-7bf0-4e63-9242-ba9ff25b7c8b","Type":"ContainerStarted","Data":"47b96cbbe813f015213b42b8d9fd658d1067ae85420a9a230ae243a164c7043f"}
Apr 22 18:45:30.803164 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.794740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" event={"ID":"da28986a-53c5-4e28-9290-9ebf6cffc2ae","Type":"ContainerStarted","Data":"42da6c7a2428f5950e7c4fa54f7687ccc0a4b946b245ebcdaf80c5e3b4f9e804"}
Apr 22 18:45:30.803164 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.796959 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk" event={"ID":"70717fa0-d526-4114-9486-7094f71b02c6","Type":"ContainerStarted","Data":"1d95ce9d67b07d7cd9c9a42440ebb454d11f9be2541cba5890249dfabcc0be5c"}
Apr 22 18:45:30.803164 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.796986 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk" event={"ID":"70717fa0-d526-4114-9486-7094f71b02c6","Type":"ContainerStarted","Data":"7c2f05d406bea89c5add19483a265f5704aa30fe65460d46a26a960333babc36"}
Apr 22 18:45:30.803164 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.799053 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" event={"ID":"f6ed56c5-616f-4090-b07c-6ea41e001ffb","Type":"ContainerStarted","Data":"7c9bfb9b4c1e0314ade4aab91ebf07b2fe0ead9cc1209eeb59fbcaf9b9201510"}
Apr 22 18:45:30.815036 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:30.814810 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8hhzk" podStartSLOduration=1.814789192 podStartE2EDuration="1.814789192s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:30.813839725 +0000 UTC m=+114.951184139" watchObservedRunningTime="2026-04-22 18:45:30.814789192 +0000 UTC m=+114.952133607"
Apr 22 18:45:31.384464 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:31.384411 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz"
Apr 22 18:45:31.384784 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:31.384667 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:45:31.384784 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:31.384769 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls podName:15b8db86-df95-44d0-9435-c1240af09808 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:33.384748427 +0000 UTC m=+117.522092833 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q7sdz" (UID: "15b8db86-df95-44d0-9435-c1240af09808") : secret "samples-operator-tls" not found
Apr 22 18:45:31.485464 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:31.485422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"
Apr 22 18:45:31.485659 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:31.485498 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:31.485659 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:31.485595 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:31.485659 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:31.485614 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:45:31.485659 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:31.485629 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674b8fcb94-fwbww: secret "image-registry-tls" not found
Apr 22 18:45:31.485919 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:31.485666 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls podName:cba8d754-67a7-482c-b98c-dcd5f2072415 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:33.485647982 +0000 UTC m=+117.622992373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkd4b" (UID: "cba8d754-67a7-482c-b98c-dcd5f2072415") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:31.485919 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:31.485687 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls podName:9594788c-903f-4331-bbd8-4125bf68a19c nodeName:}" failed. No retries permitted until 2026-04-22 18:45:33.485677821 +0000 UTC m=+117.623022214 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls") pod "image-registry-674b8fcb94-fwbww" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c") : secret "image-registry-tls" not found
Apr 22 18:45:32.806540 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.806495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn" event={"ID":"ce444571-7bf0-4e63-9242-ba9ff25b7c8b","Type":"ContainerStarted","Data":"15cc965b0077cdf0c44f5bbf075e32106ba2d0e9f9a45cf47920711a6571f58f"}
Apr 22 18:45:32.807828 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.807793 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" event={"ID":"da28986a-53c5-4e28-9290-9ebf6cffc2ae","Type":"ContainerStarted","Data":"cac59480c8c1363e10d8f571f0b27a79b46900d29e6afc025bcad280509b148d"}
Apr 22 18:45:32.809212 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.809191 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/0.log"
Apr 22 18:45:32.809339 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.809226 2569 generic.go:358] "Generic (PLEG): container finished" podID="f6ed56c5-616f-4090-b07c-6ea41e001ffb" containerID="6a0023ae39d9977ebc17ade3455ae6e0ca8db46ddfa46bda5030da8534e1a97d" exitCode=255
Apr 22 18:45:32.809339 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.809260 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" event={"ID":"f6ed56c5-616f-4090-b07c-6ea41e001ffb","Type":"ContainerDied","Data":"6a0023ae39d9977ebc17ade3455ae6e0ca8db46ddfa46bda5030da8534e1a97d"}
Apr 22 18:45:32.809512 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.809494 2569 scope.go:117] "RemoveContainer" containerID="6a0023ae39d9977ebc17ade3455ae6e0ca8db46ddfa46bda5030da8534e1a97d"
Apr 22 18:45:32.825591 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.825549 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zgbjn" podStartSLOduration=1.340920029 podStartE2EDuration="3.825530941s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:45:30.114144164 +0000 UTC m=+114.251488554" lastFinishedPulling="2026-04-22 18:45:32.598755072 +0000 UTC m=+116.736099466" observedRunningTime="2026-04-22 18:45:32.824881831 +0000 UTC m=+116.962226245" watchObservedRunningTime="2026-04-22 18:45:32.825530941 +0000 UTC m=+116.962875354"
Apr 22 18:45:32.843814 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:32.843692 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" podStartSLOduration=1.610445861 podStartE2EDuration="3.843674705s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:45:30.371975871 +0000 UTC m=+114.509320262" lastFinishedPulling="2026-04-22 18:45:32.605204714 +0000 UTC m=+116.742549106" observedRunningTime="2026-04-22 18:45:32.84307764 +0000 UTC m=+116.980422063" watchObservedRunningTime="2026-04-22 18:45:32.843674705 +0000 UTC m=+116.981019118"
Apr 22 18:45:33.400863 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.400821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz"
Apr 22 18:45:33.401062 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.400997 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:45:33.401108 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.401065 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls podName:15b8db86-df95-44d0-9435-c1240af09808 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:37.401048346 +0000 UTC m=+121.538392740 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q7sdz" (UID: "15b8db86-df95-44d0-9435-c1240af09808") : secret "samples-operator-tls" not found
Apr 22 18:45:33.502218 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.502181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww"
Apr 22 18:45:33.502385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.502271 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"
Apr 22 18:45:33.502385 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.502363 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:45:33.502504 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.502389 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674b8fcb94-fwbww: secret "image-registry-tls" not found
Apr 22 18:45:33.502504 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.502400 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:33.502504 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.502461 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls podName:9594788c-903f-4331-bbd8-4125bf68a19c nodeName:}" failed. No retries permitted until 2026-04-22 18:45:37.502440449 +0000 UTC m=+121.639784863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls") pod "image-registry-674b8fcb94-fwbww" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c") : secret "image-registry-tls" not found
Apr 22 18:45:33.502504 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.502480 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls podName:cba8d754-67a7-482c-b98c-dcd5f2072415 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:37.502470603 +0000 UTC m=+121.639814998 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkd4b" (UID: "cba8d754-67a7-482c-b98c-dcd5f2072415") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:45:33.813179 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.813159 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/1.log"
Apr 22 18:45:33.813547 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.813519 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/0.log"
Apr 22 18:45:33.813603 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.813553 2569 generic.go:358] "Generic (PLEG): container finished" podID="f6ed56c5-616f-4090-b07c-6ea41e001ffb" containerID="b007a28d4cc21799289dd886ec9a0b2d4c337bf5ba9df9c178f49bd9414fc7dc" exitCode=255
Apr 22 18:45:33.813603 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.813578 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" event={"ID":"f6ed56c5-616f-4090-b07c-6ea41e001ffb","Type":"ContainerDied","Data":"b007a28d4cc21799289dd886ec9a0b2d4c337bf5ba9df9c178f49bd9414fc7dc"}
Apr 22 18:45:33.813693 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.813614 2569 scope.go:117] "RemoveContainer" containerID="6a0023ae39d9977ebc17ade3455ae6e0ca8db46ddfa46bda5030da8534e1a97d"
Apr 22 18:45:33.813955 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:33.813932 2569 scope.go:117] "RemoveContainer" containerID="b007a28d4cc21799289dd886ec9a0b2d4c337bf5ba9df9c178f49bd9414fc7dc"
Apr 22 18:45:33.814145 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:33.814120 2569 pod_workers.go:1301]
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-whxx8_openshift-console-operator(f6ed56c5-616f-4090-b07c-6ea41e001ffb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" podUID="f6ed56c5-616f-4090-b07c-6ea41e001ffb" Apr 22 18:45:34.817971 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:34.817945 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/1.log" Apr 22 18:45:34.818352 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:34.818288 2569 scope.go:117] "RemoveContainer" containerID="b007a28d4cc21799289dd886ec9a0b2d4c337bf5ba9df9c178f49bd9414fc7dc" Apr 22 18:45:34.818457 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:34.818440 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-whxx8_openshift-console-operator(f6ed56c5-616f-4090-b07c-6ea41e001ffb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" podUID="f6ed56c5-616f-4090-b07c-6ea41e001ffb" Apr 22 18:45:35.258862 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:35.258785 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2dk4j_f3cf8a0e-7d54-44c9-952a-7acd6af68bc4/dns-node-resolver/0.log" Apr 22 18:45:36.659325 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:36.659298 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vckqs_1d694bc1-10aa-4951-95ba-8ec4878fea46/node-ca/0.log" Apr 22 18:45:37.435944 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:37.435902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:37.436145 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:37.436053 2569 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:45:37.436145 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:37.436132 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls podName:15b8db86-df95-44d0-9435-c1240af09808 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:45.436115307 +0000 UTC m=+129.573459697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-q7sdz" (UID: "15b8db86-df95-44d0-9435-c1240af09808") : secret "samples-operator-tls" not found Apr 22 18:45:37.536326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:37.536283 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:37.536537 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:37.536459 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:45:37.536537 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:37.536473 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:37.536537 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:37.536485 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674b8fcb94-fwbww: secret "image-registry-tls" not found Apr 22 18:45:37.536701 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:37.536549 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls podName:9594788c-903f-4331-bbd8-4125bf68a19c nodeName:}" failed. No retries permitted until 2026-04-22 18:45:45.5365291 +0000 UTC m=+129.673873491 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls") pod "image-registry-674b8fcb94-fwbww" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c") : secret "image-registry-tls" not found Apr 22 18:45:37.536701 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:37.536579 2569 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:45:37.536701 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:37.536642 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls podName:cba8d754-67a7-482c-b98c-dcd5f2072415 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:45.536629927 +0000 UTC m=+129.673974317 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hkd4b" (UID: "cba8d754-67a7-482c-b98c-dcd5f2072415") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:45:39.990774 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:39.990733 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:39.990774 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:39.990776 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:45:39.991194 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:39.991121 2569 scope.go:117] "RemoveContainer" containerID="b007a28d4cc21799289dd886ec9a0b2d4c337bf5ba9df9c178f49bd9414fc7dc" Apr 22 18:45:39.991296 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:39.991279 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-whxx8_openshift-console-operator(f6ed56c5-616f-4090-b07c-6ea41e001ffb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" podUID="f6ed56c5-616f-4090-b07c-6ea41e001ffb" Apr 22 18:45:45.501263 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.501225 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:45.503809 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:45:45.503785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/15b8db86-df95-44d0-9435-c1240af09808-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-q7sdz\" (UID: \"15b8db86-df95-44d0-9435-c1240af09808\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:45.599254 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.599216 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-87jhn\"" Apr 22 18:45:45.601587 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.601567 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:45.601670 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.601653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:45.604146 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.604113 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/cba8d754-67a7-482c-b98c-dcd5f2072415-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hkd4b\" (UID: \"cba8d754-67a7-482c-b98c-dcd5f2072415\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:45.604146 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.604135 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"image-registry-674b8fcb94-fwbww\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:45.606955 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.606935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" Apr 22 18:45:45.696882 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.696852 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-f7mrt\"" Apr 22 18:45:45.701583 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.701554 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-nnjgq\"" Apr 22 18:45:45.704778 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.704756 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" Apr 22 18:45:45.709474 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.709445 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:45.731621 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.731589 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz"] Apr 22 18:45:45.844393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.844343 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" event={"ID":"15b8db86-df95-44d0-9435-c1240af09808","Type":"ContainerStarted","Data":"28822c9c8daf5ddbb5f4fc3f7f3fc55e037d6e28d0a4c29f3c4a49560fd67e14"} Apr 22 18:45:45.844591 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.844570 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b"] Apr 22 18:45:45.848421 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:45.848395 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba8d754_67a7_482c_b98c_dcd5f2072415.slice/crio-b61dfbf352039572266ecdd4714fe7213a4044fa057151bd9888e4e3ce4d7f38 WatchSource:0}: Error finding container b61dfbf352039572266ecdd4714fe7213a4044fa057151bd9888e4e3ce4d7f38: Status 404 returned error can't find the container with id b61dfbf352039572266ecdd4714fe7213a4044fa057151bd9888e4e3ce4d7f38 Apr 22 18:45:45.865834 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:45.865788 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-674b8fcb94-fwbww"] Apr 22 18:45:45.868505 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:45.868477 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9594788c_903f_4331_bbd8_4125bf68a19c.slice/crio-8be05e39b42499d0b0370a092bed79fc23b3c0fd518e4e8a27b00f9e1ddbc1f7 WatchSource:0}: Error 
finding container 8be05e39b42499d0b0370a092bed79fc23b3c0fd518e4e8a27b00f9e1ddbc1f7: Status 404 returned error can't find the container with id 8be05e39b42499d0b0370a092bed79fc23b3c0fd518e4e8a27b00f9e1ddbc1f7 Apr 22 18:45:46.206336 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.206304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:45:46.208487 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.208468 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0d277017-e970-494a-84b8-a3c2be9bbf85-metrics-certs\") pod \"network-metrics-daemon-srr7v\" (UID: \"0d277017-e970-494a-84b8-a3c2be9bbf85\") " pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:45:46.363438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.363199 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgrtw\"" Apr 22 18:45:46.370487 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.370452 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-srr7v" Apr 22 18:45:46.518744 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.518688 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-srr7v"] Apr 22 18:45:46.848181 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.848138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" event={"ID":"cba8d754-67a7-482c-b98c-dcd5f2072415","Type":"ContainerStarted","Data":"b61dfbf352039572266ecdd4714fe7213a4044fa057151bd9888e4e3ce4d7f38"} Apr 22 18:45:46.849460 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.849416 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srr7v" event={"ID":"0d277017-e970-494a-84b8-a3c2be9bbf85","Type":"ContainerStarted","Data":"827012b6d96741a9230242e6a60a922157566062fe0139f6e2471b4d0dfd74e2"} Apr 22 18:45:46.850921 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.850894 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" event={"ID":"9594788c-903f-4331-bbd8-4125bf68a19c","Type":"ContainerStarted","Data":"76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b"} Apr 22 18:45:46.851058 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.850928 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" event={"ID":"9594788c-903f-4331-bbd8-4125bf68a19c","Type":"ContainerStarted","Data":"8be05e39b42499d0b0370a092bed79fc23b3c0fd518e4e8a27b00f9e1ddbc1f7"} Apr 22 18:45:46.851117 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.851082 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:45:46.878607 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:46.878515 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" podStartSLOduration=17.878495681 podStartE2EDuration="17.878495681s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:46.877761708 +0000 UTC m=+131.015106121" watchObservedRunningTime="2026-04-22 18:45:46.878495681 +0000 UTC m=+131.015840095" Apr 22 18:45:48.859053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:48.859015 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srr7v" event={"ID":"0d277017-e970-494a-84b8-a3c2be9bbf85","Type":"ContainerStarted","Data":"56d20150c1ad35537bfd6004dc828a47f451c5446083be20e188e7563d78294b"} Apr 22 18:45:48.859527 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:48.859061 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-srr7v" event={"ID":"0d277017-e970-494a-84b8-a3c2be9bbf85","Type":"ContainerStarted","Data":"d01e35ab1052fb1ff9b7421dce5f80226920cb10d2e4dc02d4e4946c1744fba7"} Apr 22 18:45:48.860580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:48.860541 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" event={"ID":"15b8db86-df95-44d0-9435-c1240af09808","Type":"ContainerStarted","Data":"f8bd63444a658523370a862a6904fb6185adbb58aa61dab93f24db8446d591ec"} Apr 22 18:45:48.860580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:48.860577 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" event={"ID":"15b8db86-df95-44d0-9435-c1240af09808","Type":"ContainerStarted","Data":"62ef5ece87247d87cf31d611f69aba3732234e6b0088902a55d5b1f6bf54f984"} Apr 22 18:45:48.861854 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:45:48.861832 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" event={"ID":"cba8d754-67a7-482c-b98c-dcd5f2072415","Type":"ContainerStarted","Data":"4795d1920543c93a00bce5dba23060ec7c2d596ad2826d115d81e42ac6613f77"} Apr 22 18:45:48.876638 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:48.876582 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-srr7v" podStartSLOduration=131.17234572 podStartE2EDuration="2m12.876565549s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:45:46.524467964 +0000 UTC m=+130.661812369" lastFinishedPulling="2026-04-22 18:45:48.228687797 +0000 UTC m=+132.366032198" observedRunningTime="2026-04-22 18:45:48.876294114 +0000 UTC m=+133.013638526" watchObservedRunningTime="2026-04-22 18:45:48.876565549 +0000 UTC m=+133.013909963" Apr 22 18:45:48.895942 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:48.895884 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-q7sdz" podStartSLOduration=17.457552961 podStartE2EDuration="19.895868867s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:45:45.787636475 +0000 UTC m=+129.924980882" lastFinishedPulling="2026-04-22 18:45:48.225952379 +0000 UTC m=+132.363296788" observedRunningTime="2026-04-22 18:45:48.895785709 +0000 UTC m=+133.033130123" watchObservedRunningTime="2026-04-22 18:45:48.895868867 +0000 UTC m=+133.033213273" Apr 22 18:45:48.918425 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:48.916135 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hkd4b" podStartSLOduration=17.538348108 podStartE2EDuration="19.916108364s" podCreationTimestamp="2026-04-22 18:45:29 +0000 
UTC" firstStartedPulling="2026-04-22 18:45:45.850156432 +0000 UTC m=+129.987500823" lastFinishedPulling="2026-04-22 18:45:48.227916671 +0000 UTC m=+132.365261079" observedRunningTime="2026-04-22 18:45:48.914360655 +0000 UTC m=+133.051705069" watchObservedRunningTime="2026-04-22 18:45:48.916108364 +0000 UTC m=+133.053452778" Apr 22 18:45:54.449062 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:54.449032 2569 scope.go:117] "RemoveContainer" containerID="b007a28d4cc21799289dd886ec9a0b2d4c337bf5ba9df9c178f49bd9414fc7dc" Apr 22 18:45:54.886181 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:54.886152 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:45:54.886522 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:54.886504 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/1.log" Apr 22 18:45:54.886636 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:54.886542 2569 generic.go:358] "Generic (PLEG): container finished" podID="f6ed56c5-616f-4090-b07c-6ea41e001ffb" containerID="8ca1f391e8d026ea161b71301c6ee56d254d0fdc1b73c1c179f124b2476f9198" exitCode=255 Apr 22 18:45:54.886636 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:54.886612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" event={"ID":"f6ed56c5-616f-4090-b07c-6ea41e001ffb","Type":"ContainerDied","Data":"8ca1f391e8d026ea161b71301c6ee56d254d0fdc1b73c1c179f124b2476f9198"} Apr 22 18:45:54.886731 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:54.886655 2569 scope.go:117] "RemoveContainer" containerID="b007a28d4cc21799289dd886ec9a0b2d4c337bf5ba9df9c178f49bd9414fc7dc" Apr 22 18:45:54.886996 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:54.886978 2569 scope.go:117] 
"RemoveContainer" containerID="8ca1f391e8d026ea161b71301c6ee56d254d0fdc1b73c1c179f124b2476f9198" Apr 22 18:45:54.887190 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:54.887173 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-whxx8_openshift-console-operator(f6ed56c5-616f-4090-b07c-6ea41e001ffb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" podUID="f6ed56c5-616f-4090-b07c-6ea41e001ffb" Apr 22 18:45:55.890996 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:55.890969 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:45:58.873451 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.873415 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nr9dg"] Apr 22 18:45:58.877889 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.877869 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nr9dg" Apr 22 18:45:58.881374 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.881347 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:45:58.882798 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.882768 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:45:58.882798 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.882785 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:45:58.882996 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.882785 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zv9q9\"" Apr 22 18:45:58.883433 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.883269 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:45:58.891728 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.891686 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nr9dg"] Apr 22 18:45:58.909576 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.909547 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"] Apr 22 18:45:58.912662 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.912641 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn" Apr 22 18:45:58.917919 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.917890 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:45:58.918078 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.918045 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-xwfmt\"" Apr 22 18:45:58.927663 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.927638 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"] Apr 22 18:45:58.983810 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:58.983776 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-674b8fcb94-fwbww"] Apr 22 18:45:59.000225 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.000187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0970c196-c954-48e1-a580-5cdba184f826-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg" Apr 22 18:45:59.000417 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.000237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0970c196-c954-48e1-a580-5cdba184f826-crio-socket\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg" Apr 22 18:45:59.000417 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.000257 
2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0970c196-c954-48e1-a580-5cdba184f826-data-volume\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.000417 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.000336 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0970c196-c954-48e1-a580-5cdba184f826-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.000417 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.000404 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9dqm\" (UniqueName: \"kubernetes.io/projected/0970c196-c954-48e1-a580-5cdba184f826-kube-api-access-l9dqm\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.066605 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.066569 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c58665cd8-ln2nc"]
Apr 22 18:45:59.069744 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.069727 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.095901 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.095872 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c58665cd8-ln2nc"]
Apr 22 18:45:59.100972 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.100946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0970c196-c954-48e1-a580-5cdba184f826-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.101110 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0970c196-c954-48e1-a580-5cdba184f826-crio-socket\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.101110 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0970c196-c954-48e1-a580-5cdba184f826-crio-socket\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.101178 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101125 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0970c196-c954-48e1-a580-5cdba184f826-data-volume\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.101213 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0970c196-c954-48e1-a580-5cdba184f826-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.101213 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9dqm\" (UniqueName: \"kubernetes.io/projected/0970c196-c954-48e1-a580-5cdba184f826-kube-api-access-l9dqm\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.101271 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101236 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bppcn\" (UID: \"52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"
Apr 22 18:45:59.101486 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101467 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0970c196-c954-48e1-a580-5cdba184f826-data-volume\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.101761 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.101739 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0970c196-c954-48e1-a580-5cdba184f826-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.103374 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.103355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0970c196-c954-48e1-a580-5cdba184f826-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.128619 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.128540 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9dqm\" (UniqueName: \"kubernetes.io/projected/0970c196-c954-48e1-a580-5cdba184f826-kube-api-access-l9dqm\") pod \"insights-runtime-extractor-nr9dg\" (UID: \"0970c196-c954-48e1-a580-5cdba184f826\") " pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.191062 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.191023 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nr9dg"
Apr 22 18:45:59.202049 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202011 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-bound-sa-token\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.202217 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202067 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37da8efb-41bd-482a-858a-29a00dd98073-image-registry-private-configuration\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.202217 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37da8efb-41bd-482a-858a-29a00dd98073-installation-pull-secrets\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.202217 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202193 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37da8efb-41bd-482a-858a-29a00dd98073-registry-certificates\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.202398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37da8efb-41bd-482a-858a-29a00dd98073-trusted-ca\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.202398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bppcn\" (UID: \"52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"
Apr 22 18:45:59.202398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202334 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-registry-tls\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.202398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37da8efb-41bd-482a-858a-29a00dd98073-ca-trust-extracted\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.202398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.202389 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z562s\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-kube-api-access-z562s\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.204907 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.204876 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bppcn\" (UID: \"52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"
Apr 22 18:45:59.221665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.221632 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"
Apr 22 18:45:59.303379 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37da8efb-41bd-482a-858a-29a00dd98073-registry-certificates\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.303574 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37da8efb-41bd-482a-858a-29a00dd98073-trusted-ca\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.303574 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303494 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-registry-tls\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.303574 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303517 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37da8efb-41bd-482a-858a-29a00dd98073-ca-trust-extracted\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.303574 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303549 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z562s\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-kube-api-access-z562s\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.303817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303596 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-bound-sa-token\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.303817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303627 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37da8efb-41bd-482a-858a-29a00dd98073-image-registry-private-configuration\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.303817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.303683 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37da8efb-41bd-482a-858a-29a00dd98073-installation-pull-secrets\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.304459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.304408 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37da8efb-41bd-482a-858a-29a00dd98073-registry-certificates\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.304819 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.304590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37da8efb-41bd-482a-858a-29a00dd98073-trusted-ca\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.304819 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.304590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37da8efb-41bd-482a-858a-29a00dd98073-ca-trust-extracted\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.307249 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.307218 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37da8efb-41bd-482a-858a-29a00dd98073-installation-pull-secrets\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.307817 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.307424 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37da8efb-41bd-482a-858a-29a00dd98073-image-registry-private-configuration\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.307967 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.307945 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-registry-tls\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.319287 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.319257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z562s\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-kube-api-access-z562s\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.325407 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.325378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37da8efb-41bd-482a-858a-29a00dd98073-bound-sa-token\") pod \"image-registry-c58665cd8-ln2nc\" (UID: \"37da8efb-41bd-482a-858a-29a00dd98073\") " pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.332961 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.332939 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nr9dg"]
Apr 22 18:45:59.335701 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:59.335673 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0970c196_c954_48e1_a580_5cdba184f826.slice/crio-6c99820705d5a486553a38f557ff514efaac6d28dc5697165c09c2aa2561e6d5 WatchSource:0}: Error finding container 6c99820705d5a486553a38f557ff514efaac6d28dc5697165c09c2aa2561e6d5: Status 404 returned error can't find the container with id 6c99820705d5a486553a38f557ff514efaac6d28dc5697165c09c2aa2561e6d5
Apr 22 18:45:59.363735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.363687 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"]
Apr 22 18:45:59.373895 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:59.373862 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52424655_8dfa_4bcb_8b83_ccbf0a6ff7c9.slice/crio-746213c899451788e3ae21f27fd71bdfb35cc73ccc3add568c1cb033b2dbc895 WatchSource:0}: Error finding container 746213c899451788e3ae21f27fd71bdfb35cc73ccc3add568c1cb033b2dbc895: Status 404 returned error can't find the container with id 746213c899451788e3ae21f27fd71bdfb35cc73ccc3add568c1cb033b2dbc895
Apr 22 18:45:59.378486 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.378458 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.515958 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.515917 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c58665cd8-ln2nc"]
Apr 22 18:45:59.518457 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:45:59.518431 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37da8efb_41bd_482a_858a_29a00dd98073.slice/crio-08d7d9c5e991c10335ce4676b80da48907b0ad57dc473cf6d9c57541ca9225e3 WatchSource:0}: Error finding container 08d7d9c5e991c10335ce4676b80da48907b0ad57dc473cf6d9c57541ca9225e3: Status 404 returned error can't find the container with id 08d7d9c5e991c10335ce4676b80da48907b0ad57dc473cf6d9c57541ca9225e3
Apr 22 18:45:59.904927 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.904858 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc" event={"ID":"37da8efb-41bd-482a-858a-29a00dd98073","Type":"ContainerStarted","Data":"5c806fddd6861a875a828155012a42631750ed4c8114b88f10e60169ff29d5ac"}
Apr 22 18:45:59.904927 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.904898 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc" event={"ID":"37da8efb-41bd-482a-858a-29a00dd98073","Type":"ContainerStarted","Data":"08d7d9c5e991c10335ce4676b80da48907b0ad57dc473cf6d9c57541ca9225e3"}
Apr 22 18:45:59.905462 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.905032 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc"
Apr 22 18:45:59.906746 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.906665 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn" event={"ID":"52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9","Type":"ContainerStarted","Data":"746213c899451788e3ae21f27fd71bdfb35cc73ccc3add568c1cb033b2dbc895"}
Apr 22 18:45:59.908723 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.908654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nr9dg" event={"ID":"0970c196-c954-48e1-a580-5cdba184f826","Type":"ContainerStarted","Data":"89d4a1477f55ec35de72f160771b24c10290ed24187c65c41633092b60cbca54"}
Apr 22 18:45:59.908723 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.908687 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nr9dg" event={"ID":"0970c196-c954-48e1-a580-5cdba184f826","Type":"ContainerStarted","Data":"6c99820705d5a486553a38f557ff514efaac6d28dc5697165c09c2aa2561e6d5"}
Apr 22 18:45:59.928033 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.927972 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc" podStartSLOduration=0.927948427 podStartE2EDuration="927.948427ms" podCreationTimestamp="2026-04-22 18:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:59.926430196 +0000 UTC m=+144.063774635" watchObservedRunningTime="2026-04-22 18:45:59.927948427 +0000 UTC m=+144.065292819"
Apr 22 18:45:59.990431 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.990394 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8"
Apr 22 18:45:59.990550 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.990435 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8"
Apr 22 18:45:59.990863 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:45:59.990845 2569 scope.go:117] "RemoveContainer" containerID="8ca1f391e8d026ea161b71301c6ee56d254d0fdc1b73c1c179f124b2476f9198"
Apr 22 18:45:59.991055 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:45:59.991037 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-whxx8_openshift-console-operator(f6ed56c5-616f-4090-b07c-6ea41e001ffb)\"" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" podUID="f6ed56c5-616f-4090-b07c-6ea41e001ffb"
Apr 22 18:46:00.913596 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:00.913553 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn" event={"ID":"52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9","Type":"ContainerStarted","Data":"d84694223d8d3c5b4e566292504da5e401d212945f7c2ba4081b2f4e9760be72"}
Apr 22 18:46:00.914081 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:00.913754 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"
Apr 22 18:46:00.915781 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:00.915729 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nr9dg" event={"ID":"0970c196-c954-48e1-a580-5cdba184f826","Type":"ContainerStarted","Data":"73965f89effb903eaba55f5bfc446ec232ea0091f253e6874f5fb56b8c0766f4"}
Apr 22 18:46:00.920228 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:00.920204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn"
Apr 22 18:46:00.933345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:00.933293 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bppcn" podStartSLOduration=1.9325170379999999 podStartE2EDuration="2.93327645s" podCreationTimestamp="2026-04-22 18:45:58 +0000 UTC" firstStartedPulling="2026-04-22 18:45:59.375992748 +0000 UTC m=+143.513337140" lastFinishedPulling="2026-04-22 18:46:00.376752147 +0000 UTC m=+144.514096552" observedRunningTime="2026-04-22 18:46:00.931343052 +0000 UTC m=+145.068687465" watchObservedRunningTime="2026-04-22 18:46:00.93327645 +0000 UTC m=+145.070620862"
Apr 22 18:46:01.920673 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.920636 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nr9dg" event={"ID":"0970c196-c954-48e1-a580-5cdba184f826","Type":"ContainerStarted","Data":"90025679369ccb9c2c1f6c28cd4e70a7faa4fe9626ad66bbee55930e4db54831"}
Apr 22 18:46:01.949153 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.949097 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nr9dg" podStartSLOduration=1.962545966 podStartE2EDuration="3.949080512s" podCreationTimestamp="2026-04-22 18:45:58 +0000 UTC" firstStartedPulling="2026-04-22 18:45:59.410081374 +0000 UTC m=+143.547425768" lastFinishedPulling="2026-04-22 18:46:01.396615909 +0000 UTC m=+145.533960314" observedRunningTime="2026-04-22 18:46:01.947580852 +0000 UTC m=+146.084925266" watchObservedRunningTime="2026-04-22 18:46:01.949080512 +0000 UTC m=+146.086424928"
Apr 22 18:46:01.966791 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.966754 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"]
Apr 22 18:46:01.969725 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.969693 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:01.974735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.974684 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:46:01.974885 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.974684 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 18:46:01.974885 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.974778 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 18:46:01.974885 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.974778 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-vgghl\""
Apr 22 18:46:01.986391 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:01.986358 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"]
Apr 22 18:46:02.129412 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.129368 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.129412 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.129415 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80a43513-33b5-4856-9cd9-8773d592e73a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.129649 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.129473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7l2\" (UniqueName: \"kubernetes.io/projected/80a43513-33b5-4856-9cd9-8773d592e73a-kube-api-access-8b7l2\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.129649 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.129561 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.230205 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.230106 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.230205 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.230173 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.230205 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.230200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80a43513-33b5-4856-9cd9-8773d592e73a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.230489 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.230225 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7l2\" (UniqueName: \"kubernetes.io/projected/80a43513-33b5-4856-9cd9-8773d592e73a-kube-api-access-8b7l2\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.230489 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:02.230260 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 22 18:46:02.230489 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:02.230337 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-tls podName:80a43513-33b5-4856-9cd9-8773d592e73a nodeName:}" failed. No retries permitted until 2026-04-22 18:46:02.730319664 +0000 UTC m=+146.867664075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-kdk7b" (UID: "80a43513-33b5-4856-9cd9-8773d592e73a") : secret "prometheus-operator-tls" not found
Apr 22 18:46:02.231014 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.230990 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80a43513-33b5-4856-9cd9-8773d592e73a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.232694 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.232668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.239218 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.239185 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7l2\" (UniqueName: \"kubernetes.io/projected/80a43513-33b5-4856-9cd9-8773d592e73a-kube-api-access-8b7l2\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.733982 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.733936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.736311 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.736292 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/80a43513-33b5-4856-9cd9-8773d592e73a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kdk7b\" (UID: \"80a43513-33b5-4856-9cd9-8773d592e73a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:02.878925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:02.878872 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"
Apr 22 18:46:03.004274 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:03.004245 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kdk7b"]
Apr 22 18:46:03.006858 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:03.006827 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80a43513_33b5_4856_9cd9_8773d592e73a.slice/crio-0f82c9b64e1ff72be4f10aad0b60dfbb54a18e20f580a675bbae0046af84b58b WatchSource:0}: Error finding container 0f82c9b64e1ff72be4f10aad0b60dfbb54a18e20f580a675bbae0046af84b58b: Status 404 returned error can't find the container with id 0f82c9b64e1ff72be4f10aad0b60dfbb54a18e20f580a675bbae0046af84b58b
Apr 22 18:46:03.930398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:03.930355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b" event={"ID":"80a43513-33b5-4856-9cd9-8773d592e73a","Type":"ContainerStarted","Data":"0f82c9b64e1ff72be4f10aad0b60dfbb54a18e20f580a675bbae0046af84b58b"}
Apr 22 18:46:04.936048 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:04.934645 2569 kubelet.go:2569] "SyncLoop
(PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b" event={"ID":"80a43513-33b5-4856-9cd9-8773d592e73a","Type":"ContainerStarted","Data":"3526d337d3fbfb451291d78bf9986e35addf585254a52e990725959bf098bc5a"} Apr 22 18:46:04.936048 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:04.934689 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b" event={"ID":"80a43513-33b5-4856-9cd9-8773d592e73a","Type":"ContainerStarted","Data":"70aadfd473e49c20f9c20d55f83ee63ae916a99161e044497984528b0db72e48"} Apr 22 18:46:04.965808 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:04.965745 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-kdk7b" podStartSLOduration=2.738196299 podStartE2EDuration="3.965702096s" podCreationTimestamp="2026-04-22 18:46:01 +0000 UTC" firstStartedPulling="2026-04-22 18:46:03.009111926 +0000 UTC m=+147.146456317" lastFinishedPulling="2026-04-22 18:46:04.236617721 +0000 UTC m=+148.373962114" observedRunningTime="2026-04-22 18:46:04.963983946 +0000 UTC m=+149.101328359" watchObservedRunningTime="2026-04-22 18:46:04.965702096 +0000 UTC m=+149.103046507" Apr 22 18:46:07.372513 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.372477 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt"] Apr 22 18:46:07.377203 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.377181 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.380104 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.380083 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:46:07.380468 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.380443 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:46:07.380547 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.380492 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xd5kf\"" Apr 22 18:46:07.386207 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.386186 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt"] Apr 22 18:46:07.400198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.400172 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9stw7"] Apr 22 18:46:07.403193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.403173 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cr2rh"] Apr 22 18:46:07.403361 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.403340 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.406182 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.406155 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:46:07.406338 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.406299 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:46:07.406461 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.406440 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xvkjw\"" Apr 22 18:46:07.406563 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.406547 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:46:07.406694 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.406677 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.409113 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.409094 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:46:07.409288 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.409271 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 18:46:07.409376 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.409354 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-q24sw\"" Apr 22 18:46:07.409658 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.409645 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 18:46:07.421169 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.421137 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cr2rh"] Apr 22 18:46:07.471656 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.471656 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471653 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d348323-659c-4564-a8aa-bc974d5ade37-metrics-client-ca\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471683 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-accelerators-collector-config\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471763 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471789 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-tls\") pod 
\"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471816 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-root\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471838 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471852 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-sys\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.471888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-tls\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471905 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-textfile\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471941 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbww\" (UniqueName: \"kubernetes.io/projected/6494b798-d8d1-443d-b970-89c2f4b791cc-kube-api-access-tbbww\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.471984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snm5n\" (UniqueName: \"kubernetes.io/projected/6d348323-659c-4564-a8aa-bc974d5ade37-kube-api-access-snm5n\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.472094 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6494b798-d8d1-443d-b970-89c2f4b791cc-metrics-client-ca\") pod 
\"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.472150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl29s\" (UniqueName: \"kubernetes.io/projected/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-api-access-gl29s\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.472196 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-wtmp\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.472223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.472326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.472256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" 
Apr 22 18:46:07.573339 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-root\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.573339 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573338 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.573549 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-sys\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.573549 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573397 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-sys\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.573549 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-root\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.573549 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:46:07.573476 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-tls\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.573549 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-textfile\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.573768 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573580 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.573768 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:07.573611 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:46:07.573768 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573635 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbww\" (UniqueName: \"kubernetes.io/projected/6494b798-d8d1-443d-b970-89c2f4b791cc-kube-api-access-tbbww\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.573768 ip-10-0-132-198 kubenswrapper[2569]: E0422 
18:46:07.573678 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-tls podName:6d348323-659c-4564-a8aa-bc974d5ade37 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:08.073656434 +0000 UTC m=+152.211000844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-tls") pod "node-exporter-9stw7" (UID: "6d348323-659c-4564-a8aa-bc974d5ade37") : secret "node-exporter-tls" not found Apr 22 18:46:07.573768 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573705 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snm5n\" (UniqueName: \"kubernetes.io/projected/6d348323-659c-4564-a8aa-bc974d5ade37-kube-api-access-snm5n\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.573768 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573760 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6494b798-d8d1-443d-b970-89c2f4b791cc-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.574053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573788 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl29s\" (UniqueName: \"kubernetes.io/projected/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-api-access-gl29s\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.574053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573830 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-wtmp\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.574053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.574053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.574053 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.574053 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:07.574032 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 18:46:07.574053 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:46:07.574040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d348323-659c-4564-a8aa-bc974d5ade37-metrics-client-ca\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574072 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:07.574087 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-tls podName:6494b798-d8d1-443d-b970-89c2f4b791cc nodeName:}" failed. No retries permitted until 2026-04-22 18:46:08.07407098 +0000 UTC m=+152.211415387 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-79pxt" (UID: "6494b798-d8d1-443d-b970-89c2f4b791cc") : secret "openshift-state-metrics-tls" not found Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-accelerators-collector-config\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574219 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-wtmp\") 
pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:07.574325 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574335 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:07.574365 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-tls podName:f0ea930f-4df9-4710-8bf2-32e6e3d0bf39 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:08.074352384 +0000 UTC m=+152.211696775 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-cr2rh" (UID: "f0ea930f-4df9-4710-8bf2-32e6e3d0bf39") : secret "kube-state-metrics-tls" not found Apr 22 18:46:07.574393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.573946 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-textfile\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.574930 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574431 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.574991 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d348323-659c-4564-a8aa-bc974d5ade37-metrics-client-ca\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.574991 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.574928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-accelerators-collector-config\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" 
Apr 22 18:46:07.575095 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.575055 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.575149 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.575111 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6494b798-d8d1-443d-b970-89c2f4b791cc-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.576411 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.576383 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.576672 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.576651 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:07.576923 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.576905 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.582491 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.582447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snm5n\" (UniqueName: \"kubernetes.io/projected/6d348323-659c-4564-a8aa-bc974d5ade37-kube-api-access-snm5n\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:07.582610 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.582556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbww\" (UniqueName: \"kubernetes.io/projected/6494b798-d8d1-443d-b970-89c2f4b791cc-kube-api-access-tbbww\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:07.582675 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:07.582631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl29s\" (UniqueName: \"kubernetes.io/projected/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-api-access-gl29s\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:08.079055 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.079011 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: 
\"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:08.079272 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.079063 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-tls\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:08.079272 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.079142 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:08.081631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.081597 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d348323-659c-4564-a8aa-bc974d5ade37-node-exporter-tls\") pod \"node-exporter-9stw7\" (UID: \"6d348323-659c-4564-a8aa-bc974d5ade37\") " pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:08.081771 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.081616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0ea930f-4df9-4710-8bf2-32e6e3d0bf39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-cr2rh\" (UID: \"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:08.081771 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.081748 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/6494b798-d8d1-443d-b970-89c2f4b791cc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-79pxt\" (UID: \"6494b798-d8d1-443d-b970-89c2f4b791cc\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:08.286954 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.286916 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" Apr 22 18:46:08.313904 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.313873 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9stw7" Apr 22 18:46:08.322898 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:08.322861 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d348323_659c_4564_a8aa_bc974d5ade37.slice/crio-8baa5f9fd7f80cfb4c4e82c2513f7a7bbefe30cc9433559524dd0c29a26ecdf7 WatchSource:0}: Error finding container 8baa5f9fd7f80cfb4c4e82c2513f7a7bbefe30cc9433559524dd0c29a26ecdf7: Status 404 returned error can't find the container with id 8baa5f9fd7f80cfb4c4e82c2513f7a7bbefe30cc9433559524dd0c29a26ecdf7 Apr 22 18:46:08.324326 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.324304 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" Apr 22 18:46:08.439004 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.438977 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt"] Apr 22 18:46:08.441306 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:08.441276 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6494b798_d8d1_443d_b970_89c2f4b791cc.slice/crio-805cac9e096b3db460e1a18c97e2f09ed662a1eba0899cb757a7417a3363b9aa WatchSource:0}: Error finding container 805cac9e096b3db460e1a18c97e2f09ed662a1eba0899cb757a7417a3363b9aa: Status 404 returned error can't find the container with id 805cac9e096b3db460e1a18c97e2f09ed662a1eba0899cb757a7417a3363b9aa Apr 22 18:46:08.489666 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.489632 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-cr2rh"] Apr 22 18:46:08.492699 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:08.492670 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ea930f_4df9_4710_8bf2_32e6e3d0bf39.slice/crio-0084e83b2f4aeb26cc7625848e5d011e2c760dc618f9560fdd145c5d83ec885c WatchSource:0}: Error finding container 0084e83b2f4aeb26cc7625848e5d011e2c760dc618f9560fdd145c5d83ec885c: Status 404 returned error can't find the container with id 0084e83b2f4aeb26cc7625848e5d011e2c760dc618f9560fdd145c5d83ec885c Apr 22 18:46:08.947277 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.947061 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9stw7" event={"ID":"6d348323-659c-4564-a8aa-bc974d5ade37","Type":"ContainerStarted","Data":"8baa5f9fd7f80cfb4c4e82c2513f7a7bbefe30cc9433559524dd0c29a26ecdf7"} Apr 22 18:46:08.948480 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:46:08.948452 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" event={"ID":"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39","Type":"ContainerStarted","Data":"0084e83b2f4aeb26cc7625848e5d011e2c760dc618f9560fdd145c5d83ec885c"} Apr 22 18:46:08.950466 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.950438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" event={"ID":"6494b798-d8d1-443d-b970-89c2f4b791cc","Type":"ContainerStarted","Data":"205cd39892ae0b769d1fa1c5ab3af4123a25a481a3a22c96a6fff0c819445883"} Apr 22 18:46:08.950554 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.950477 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" event={"ID":"6494b798-d8d1-443d-b970-89c2f4b791cc","Type":"ContainerStarted","Data":"fb53aa12337d50e2a7d0bd85988c34333e61f0e5ab498ea7a281507620e86bf0"} Apr 22 18:46:08.950554 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.950492 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" event={"ID":"6494b798-d8d1-443d-b970-89c2f4b791cc","Type":"ContainerStarted","Data":"805cac9e096b3db460e1a18c97e2f09ed662a1eba0899cb757a7417a3363b9aa"} Apr 22 18:46:08.992827 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.992647 2569 patch_prober.go:28] interesting pod/image-registry-674b8fcb94-fwbww container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:46:08.992827 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:08.992735 2569 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" podUID="9594788c-903f-4331-bbd8-4125bf68a19c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:46:09.955810 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:09.955768 2569 generic.go:358] "Generic (PLEG): container finished" podID="6d348323-659c-4564-a8aa-bc974d5ade37" containerID="772dd0a18fb3c0b850346f27e53ddf89bb44c7fe1dc0e9f4c78d5b66df36dd82" exitCode=0 Apr 22 18:46:09.956330 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:09.956291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9stw7" event={"ID":"6d348323-659c-4564-a8aa-bc974d5ade37","Type":"ContainerDied","Data":"772dd0a18fb3c0b850346f27e53ddf89bb44c7fe1dc0e9f4c78d5b66df36dd82"} Apr 22 18:46:09.958847 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:09.958791 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" event={"ID":"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39","Type":"ContainerStarted","Data":"7c43f9ff0f97d692b9c8eefa4dca8a74a3493955d7247289914ab6e723252f06"} Apr 22 18:46:09.964617 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:09.964580 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" event={"ID":"6494b798-d8d1-443d-b970-89c2f4b791cc","Type":"ContainerStarted","Data":"49e3cbe9da78ebbb41091c257103bf5343468b21def1707ad231eeeac6ca04b7"} Apr 22 18:46:10.041289 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:10.041219 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-79pxt" podStartSLOduration=1.792690015 podStartE2EDuration="3.04119561s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:08.576038174 +0000 UTC m=+152.713382569" lastFinishedPulling="2026-04-22 18:46:09.824543773 +0000 
UTC m=+153.961888164" observedRunningTime="2026-04-22 18:46:10.038884607 +0000 UTC m=+154.176229044" watchObservedRunningTime="2026-04-22 18:46:10.04119561 +0000 UTC m=+154.178540023" Apr 22 18:46:10.971786 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:10.971741 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9stw7" event={"ID":"6d348323-659c-4564-a8aa-bc974d5ade37","Type":"ContainerStarted","Data":"138e234f02df03216e7ab012c324bea4ae49cc806e2e96e4737160780e0bcdc3"} Apr 22 18:46:10.971786 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:10.971794 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9stw7" event={"ID":"6d348323-659c-4564-a8aa-bc974d5ade37","Type":"ContainerStarted","Data":"3730cb298d21f414a309ee22a274bddd6e6ea22a5bdefe813b7b0c3c7528c1ba"} Apr 22 18:46:10.974188 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:10.974163 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" event={"ID":"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39","Type":"ContainerStarted","Data":"a9a773f7492f08a471567e259f295e66257392cbd878d568e6876816bfdf2ec1"} Apr 22 18:46:10.974311 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:10.974194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" event={"ID":"f0ea930f-4df9-4710-8bf2-32e6e3d0bf39","Type":"ContainerStarted","Data":"9232094748cdaec710f67699dfae4bdbc42682925573f6976e469b38d29f818d"} Apr 22 18:46:11.027944 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:11.027874 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9stw7" podStartSLOduration=3.19025219 podStartE2EDuration="4.027856896s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:08.325300067 +0000 UTC m=+152.462644474" lastFinishedPulling="2026-04-22 
18:46:09.162904773 +0000 UTC m=+153.300249180" observedRunningTime="2026-04-22 18:46:11.025231591 +0000 UTC m=+155.162576028" watchObservedRunningTime="2026-04-22 18:46:11.027856896 +0000 UTC m=+155.165201308" Apr 22 18:46:11.105123 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:11.105070 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-cr2rh" podStartSLOduration=2.773338993 podStartE2EDuration="4.105049363s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:08.495186218 +0000 UTC m=+152.632530612" lastFinishedPulling="2026-04-22 18:46:09.826896575 +0000 UTC m=+153.964240982" observedRunningTime="2026-04-22 18:46:11.104613014 +0000 UTC m=+155.241957440" watchObservedRunningTime="2026-04-22 18:46:11.105049363 +0000 UTC m=+155.242393776" Apr 22 18:46:12.279326 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:12.279278 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8xrxz" podUID="c49b1678-0299-4b9e-a911-9a8cf6fedf32" Apr 22 18:46:12.286454 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:12.286414 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mcc6f" podUID="f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf" Apr 22 18:46:12.981402 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:12.981363 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mcc6f" Apr 22 18:46:12.981596 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:12.981414 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8xrxz" Apr 22 18:46:15.448450 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:15.448372 2569 scope.go:117] "RemoveContainer" containerID="8ca1f391e8d026ea161b71301c6ee56d254d0fdc1b73c1c179f124b2476f9198" Apr 22 18:46:15.991782 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:15.991753 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:46:15.991934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:15.991851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" event={"ID":"f6ed56c5-616f-4090-b07c-6ea41e001ffb","Type":"ContainerStarted","Data":"11c3fc67793b84db65ce256fea9a0877b08e21b44497c08476728d623334eabc"} Apr 22 18:46:15.992134 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:15.992105 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:46:16.013316 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.013263 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" podStartSLOduration=44.540282076 podStartE2EDuration="47.013245834s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:45:30.128893122 +0000 UTC m=+114.266237521" lastFinishedPulling="2026-04-22 18:45:32.601856873 +0000 UTC m=+116.739201279" observedRunningTime="2026-04-22 18:46:16.011483373 +0000 UTC m=+160.148827787" watchObservedRunningTime="2026-04-22 18:46:16.013245834 +0000 UTC m=+160.150590244" Apr 22 18:46:16.032361 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.032331 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-whxx8" Apr 22 18:46:16.228134 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.228073 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-4mw78"] Apr 22 18:46:16.231226 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.231211 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4mw78" Apr 22 18:46:16.234017 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.233994 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-r7zh8\"" Apr 22 18:46:16.234397 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.234381 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:46:16.234459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.234445 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:46:16.241985 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.241924 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4mw78"] Apr 22 18:46:16.357812 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.357777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9z5b\" (UniqueName: \"kubernetes.io/projected/37e5a887-0921-4401-b900-dc37b5b8a892-kube-api-access-j9z5b\") pod \"downloads-6bcc868b7-4mw78\" (UID: \"37e5a887-0921-4401-b900-dc37b5b8a892\") " pod="openshift-console/downloads-6bcc868b7-4mw78" Apr 22 18:46:16.458598 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.458561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9z5b\" (UniqueName: 
\"kubernetes.io/projected/37e5a887-0921-4401-b900-dc37b5b8a892-kube-api-access-j9z5b\") pod \"downloads-6bcc868b7-4mw78\" (UID: \"37e5a887-0921-4401-b900-dc37b5b8a892\") " pod="openshift-console/downloads-6bcc868b7-4mw78" Apr 22 18:46:16.467086 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.467059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9z5b\" (UniqueName: \"kubernetes.io/projected/37e5a887-0921-4401-b900-dc37b5b8a892-kube-api-access-j9z5b\") pod \"downloads-6bcc868b7-4mw78\" (UID: \"37e5a887-0921-4401-b900-dc37b5b8a892\") " pod="openshift-console/downloads-6bcc868b7-4mw78" Apr 22 18:46:16.540071 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.539980 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4mw78" Apr 22 18:46:16.687568 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.687537 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4mw78"] Apr 22 18:46:16.693027 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:16.692989 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e5a887_0921_4401_b900_dc37b5b8a892.slice/crio-ad750ce289ed6ac1a1d61943121ffac7b74bae7942485b9396c798df3dc4a57e WatchSource:0}: Error finding container ad750ce289ed6ac1a1d61943121ffac7b74bae7942485b9396c798df3dc4a57e: Status 404 returned error can't find the container with id ad750ce289ed6ac1a1d61943121ffac7b74bae7942485b9396c798df3dc4a57e Apr 22 18:46:16.995604 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:16.995564 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4mw78" event={"ID":"37e5a887-0921-4401-b900-dc37b5b8a892","Type":"ContainerStarted","Data":"ad750ce289ed6ac1a1d61943121ffac7b74bae7942485b9396c798df3dc4a57e"} Apr 22 18:46:17.166009 ip-10-0-132-198 kubenswrapper[2569]: I0422 
18:46:17.165971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz" Apr 22 18:46:17.166009 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.166022 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f" Apr 22 18:46:17.168433 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.168402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf-metrics-tls\") pod \"dns-default-mcc6f\" (UID: \"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf\") " pod="openshift-dns/dns-default-mcc6f" Apr 22 18:46:17.168537 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.168492 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c49b1678-0299-4b9e-a911-9a8cf6fedf32-cert\") pod \"ingress-canary-8xrxz\" (UID: \"c49b1678-0299-4b9e-a911-9a8cf6fedf32\") " pod="openshift-ingress-canary/ingress-canary-8xrxz" Apr 22 18:46:17.185244 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.185214 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xmncv\"" Apr 22 18:46:17.186256 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.186235 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xcpp5\"" Apr 22 18:46:17.193382 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.193346 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8xrxz" Apr 22 18:46:17.193469 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.193367 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mcc6f" Apr 22 18:46:17.347028 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.346991 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8xrxz"] Apr 22 18:46:17.351152 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:17.351121 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc49b1678_0299_4b9e_a911_9a8cf6fedf32.slice/crio-802d7054948b536642bf26dab949e8e59dbd3f3b123c1ada076e43c0a0fdcb11 WatchSource:0}: Error finding container 802d7054948b536642bf26dab949e8e59dbd3f3b123c1ada076e43c0a0fdcb11: Status 404 returned error can't find the container with id 802d7054948b536642bf26dab949e8e59dbd3f3b123c1ada076e43c0a0fdcb11 Apr 22 18:46:17.375350 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:17.375326 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mcc6f"] Apr 22 18:46:17.377753 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:17.377728 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0952b9b_e2e6_4d5b_872a_d6b8ba86bbdf.slice/crio-eaea55f38a931c5734f1cbe6a0733bca229c1c02cead3e427c1a2c892192b282 WatchSource:0}: Error finding container eaea55f38a931c5734f1cbe6a0733bca229c1c02cead3e427c1a2c892192b282: Status 404 returned error can't find the container with id eaea55f38a931c5734f1cbe6a0733bca229c1c02cead3e427c1a2c892192b282 Apr 22 18:46:18.003028 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:18.002985 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mcc6f" 
event={"ID":"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf","Type":"ContainerStarted","Data":"eaea55f38a931c5734f1cbe6a0733bca229c1c02cead3e427c1a2c892192b282"} Apr 22 18:46:18.004430 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:18.004384 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8xrxz" event={"ID":"c49b1678-0299-4b9e-a911-9a8cf6fedf32","Type":"ContainerStarted","Data":"802d7054948b536642bf26dab949e8e59dbd3f3b123c1ada076e43c0a0fdcb11"} Apr 22 18:46:18.989518 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:18.989417 2569 patch_prober.go:28] interesting pod/image-registry-674b8fcb94-fwbww container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:46:18.989518 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:18.989482 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" podUID="9594788c-903f-4331-bbd8-4125bf68a19c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:46:19.384348 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:19.384314 2569 patch_prober.go:28] interesting pod/image-registry-c58665cd8-ln2nc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:46:19.384852 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:19.384376 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc" podUID="37da8efb-41bd-482a-858a-29a00dd98073" containerName="registry" probeResult="failure" output="HTTP 
probe failed with statuscode: 503" Apr 22 18:46:20.013802 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:20.013505 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mcc6f" event={"ID":"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf","Type":"ContainerStarted","Data":"47f49f6d0ea0cee791cfd8e33081f97913a4d8818b1895ced2012ac698f5d8c4"} Apr 22 18:46:20.013802 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:20.013544 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mcc6f" event={"ID":"f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf","Type":"ContainerStarted","Data":"0c0ff4128b04a3ad800d88e4f4b4d8402b1c0b7ab909930df01926232e9b067c"} Apr 22 18:46:20.013802 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:20.013693 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mcc6f" Apr 22 18:46:20.015990 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:20.015960 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8xrxz" event={"ID":"c49b1678-0299-4b9e-a911-9a8cf6fedf32","Type":"ContainerStarted","Data":"8af13c0525f2aa454c40c0e91d8bf32b08c48e51cc0b5f385e5126ba8282a119"} Apr 22 18:46:20.035211 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:20.035145 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mcc6f" podStartSLOduration=128.856761546 podStartE2EDuration="2m11.035123163s" podCreationTimestamp="2026-04-22 18:44:09 +0000 UTC" firstStartedPulling="2026-04-22 18:46:17.379598792 +0000 UTC m=+161.516943186" lastFinishedPulling="2026-04-22 18:46:19.557960383 +0000 UTC m=+163.695304803" observedRunningTime="2026-04-22 18:46:20.033000141 +0000 UTC m=+164.170344555" watchObservedRunningTime="2026-04-22 18:46:20.035123163 +0000 UTC m=+164.172467578" Apr 22 18:46:20.052657 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:20.052577 2569 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-ingress-canary/ingress-canary-8xrxz" podStartSLOduration=128.845867816 podStartE2EDuration="2m11.052550582s" podCreationTimestamp="2026-04-22 18:44:09 +0000 UTC" firstStartedPulling="2026-04-22 18:46:17.353565278 +0000 UTC m=+161.490909672" lastFinishedPulling="2026-04-22 18:46:19.560248027 +0000 UTC m=+163.697592438" observedRunningTime="2026-04-22 18:46:20.051125995 +0000 UTC m=+164.188470409" watchObservedRunningTime="2026-04-22 18:46:20.052550582 +0000 UTC m=+164.189894990" Apr 22 18:46:20.920763 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:20.920732 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c58665cd8-ln2nc" Apr 22 18:46:22.334332 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.334296 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84ddbb5c67-wrktf"] Apr 22 18:46:22.337206 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.337176 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.340608 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.340581 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-h5w7t\"" Apr 22 18:46:22.340765 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.340581 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:46:22.340765 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.340591 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:46:22.340765 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.340594 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:46:22.341531 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.341416 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:46:22.341531 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.341436 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:46:22.349096 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.349062 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84ddbb5c67-wrktf"] Apr 22 18:46:22.421559 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.421516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-serving-cert\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.421756 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:46:22.421577 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-config\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.421756 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.421648 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-service-ca\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.421756 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.421728 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvvjm\" (UniqueName: \"kubernetes.io/projected/4b3eedf8-35e0-469d-9e18-020cbe2478b9-kube-api-access-rvvjm\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.421934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.421814 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-oauth-config\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.421934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.421863 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-oauth-serving-cert\") pod \"console-84ddbb5c67-wrktf\" 
(UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.522771 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.522725 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvvjm\" (UniqueName: \"kubernetes.io/projected/4b3eedf8-35e0-469d-9e18-020cbe2478b9-kube-api-access-rvvjm\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.522953 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.522786 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-oauth-config\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.522953 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.522834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-oauth-serving-cert\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.522953 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.522858 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-serving-cert\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.522953 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.522912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-config\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.523156 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.522958 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-service-ca\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.523736 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.523619 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-oauth-serving-cert\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.523736 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.523642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-service-ca\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.523894 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.523858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-config\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.525764 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.525738 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-serving-cert\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.525878 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.525745 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-oauth-config\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.537916 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.537891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvvjm\" (UniqueName: \"kubernetes.io/projected/4b3eedf8-35e0-469d-9e18-020cbe2478b9-kube-api-access-rvvjm\") pod \"console-84ddbb5c67-wrktf\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.649908 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.649805 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:22.803953 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:22.803848 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84ddbb5c67-wrktf"] Apr 22 18:46:22.807200 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:22.807163 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b3eedf8_35e0_469d_9e18_020cbe2478b9.slice/crio-bcc6d8a5bbeb3505a82f869063aaae191f88a823aa21fa45cd82f3289cda17a8 WatchSource:0}: Error finding container bcc6d8a5bbeb3505a82f869063aaae191f88a823aa21fa45cd82f3289cda17a8: Status 404 returned error can't find the container with id bcc6d8a5bbeb3505a82f869063aaae191f88a823aa21fa45cd82f3289cda17a8 Apr 22 18:46:23.029557 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:23.029468 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84ddbb5c67-wrktf" event={"ID":"4b3eedf8-35e0-469d-9e18-020cbe2478b9","Type":"ContainerStarted","Data":"bcc6d8a5bbeb3505a82f869063aaae191f88a823aa21fa45cd82f3289cda17a8"} Apr 22 18:46:24.003092 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.003017 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" podUID="9594788c-903f-4331-bbd8-4125bf68a19c" containerName="registry" containerID="cri-o://76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b" gracePeriod=30 Apr 22 18:46:24.281117 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.281090 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:46:24.338441 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338392 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-installation-pull-secrets\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.338631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338475 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9594788c-903f-4331-bbd8-4125bf68a19c-ca-trust-extracted\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.338631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338519 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-image-registry-private-configuration\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.338631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338573 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-registry-certificates\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.338631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338603 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-trusted-ca\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: 
\"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.338872 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338632 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7qmw\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-kube-api-access-r7qmw\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.338872 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338695 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-bound-sa-token\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.338872 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.338737 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") pod \"9594788c-903f-4331-bbd8-4125bf68a19c\" (UID: \"9594788c-903f-4331-bbd8-4125bf68a19c\") " Apr 22 18:46:24.339668 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.339636 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:24.339886 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.339676 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:24.342408 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.342379 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:24.342705 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.342673 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:24.342937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.342895 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:24.343586 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.343546 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-kube-api-access-r7qmw" (OuterVolumeSpecName: "kube-api-access-r7qmw") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "kube-api-access-r7qmw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:24.346074 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.346038 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:24.349850 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.349822 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9594788c-903f-4331-bbd8-4125bf68a19c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9594788c-903f-4331-bbd8-4125bf68a19c" (UID: "9594788c-903f-4331-bbd8-4125bf68a19c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:46:24.439577 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439534 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-installation-pull-secrets\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:46:24.439577 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439568 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9594788c-903f-4331-bbd8-4125bf68a19c-ca-trust-extracted\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:46:24.439577 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439584 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9594788c-903f-4331-bbd8-4125bf68a19c-image-registry-private-configuration\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:46:24.439888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439598 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-registry-certificates\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:46:24.439888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439611 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9594788c-903f-4331-bbd8-4125bf68a19c-trusted-ca\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:46:24.439888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439620 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r7qmw\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-kube-api-access-r7qmw\") on node \"ip-10-0-132-198.ec2.internal\" 
DevicePath \"\"" Apr 22 18:46:24.439888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439629 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-bound-sa-token\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:46:24.439888 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:24.439638 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9594788c-903f-4331-bbd8-4125bf68a19c-registry-tls\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:46:25.038112 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.038069 2569 generic.go:358] "Generic (PLEG): container finished" podID="9594788c-903f-4331-bbd8-4125bf68a19c" containerID="76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b" exitCode=0 Apr 22 18:46:25.038575 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.038149 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" Apr 22 18:46:25.038575 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.038196 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" event={"ID":"9594788c-903f-4331-bbd8-4125bf68a19c","Type":"ContainerDied","Data":"76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b"} Apr 22 18:46:25.038575 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.038237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674b8fcb94-fwbww" event={"ID":"9594788c-903f-4331-bbd8-4125bf68a19c","Type":"ContainerDied","Data":"8be05e39b42499d0b0370a092bed79fc23b3c0fd518e4e8a27b00f9e1ddbc1f7"} Apr 22 18:46:25.038575 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.038257 2569 scope.go:117] "RemoveContainer" containerID="76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b" Apr 22 18:46:25.057116 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.057076 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-674b8fcb94-fwbww"] Apr 22 18:46:25.061133 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.061079 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-674b8fcb94-fwbww"] Apr 22 18:46:25.756469 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.756448 2569 scope.go:117] "RemoveContainer" containerID="76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b" Apr 22 18:46:25.756850 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:46:25.756829 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b\": container with ID starting with 76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b not found: ID does not exist" 
containerID="76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b" Apr 22 18:46:25.756907 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:25.756861 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b"} err="failed to get container status \"76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b\": rpc error: code = NotFound desc = could not find container \"76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b\": container with ID starting with 76bbba12a4f27f86758cf69e68c3be7c2bfecc8f82d5cf93930d392a5b156b8b not found: ID does not exist" Apr 22 18:46:26.043879 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:26.043788 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84ddbb5c67-wrktf" event={"ID":"4b3eedf8-35e0-469d-9e18-020cbe2478b9","Type":"ContainerStarted","Data":"1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903"} Apr 22 18:46:26.064646 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:26.064584 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84ddbb5c67-wrktf" podStartSLOduration=1.088601057 podStartE2EDuration="4.064564263s" podCreationTimestamp="2026-04-22 18:46:22 +0000 UTC" firstStartedPulling="2026-04-22 18:46:22.809665112 +0000 UTC m=+166.947009504" lastFinishedPulling="2026-04-22 18:46:25.785628305 +0000 UTC m=+169.922972710" observedRunningTime="2026-04-22 18:46:26.063250652 +0000 UTC m=+170.200595067" watchObservedRunningTime="2026-04-22 18:46:26.064564263 +0000 UTC m=+170.201908679" Apr 22 18:46:26.455696 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:26.455662 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9594788c-903f-4331-bbd8-4125bf68a19c" path="/var/lib/kubelet/pods/9594788c-903f-4331-bbd8-4125bf68a19c/volumes" Apr 22 18:46:30.022695 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:46:30.022663 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mcc6f" Apr 22 18:46:31.330507 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.330466 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d68fd887-gbs86"] Apr 22 18:46:31.331034 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.330874 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9594788c-903f-4331-bbd8-4125bf68a19c" containerName="registry" Apr 22 18:46:31.331034 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.330893 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="9594788c-903f-4331-bbd8-4125bf68a19c" containerName="registry" Apr 22 18:46:31.331034 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.330990 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="9594788c-903f-4331-bbd8-4125bf68a19c" containerName="registry" Apr 22 18:46:31.334019 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.333992 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.344262 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.344226 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:46:31.350575 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.350542 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d68fd887-gbs86"] Apr 22 18:46:31.404525 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.404472 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-serving-cert\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.404525 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.404530 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnlgr\" (UniqueName: \"kubernetes.io/projected/b9fe5138-06a1-4672-b71c-fc09aff96f0a-kube-api-access-tnlgr\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.404855 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.404582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-oauth-config\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.404855 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.404683 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-service-ca\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.404855 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.404776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-config\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.404855 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.404812 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-oauth-serving-cert\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.404855 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.404846 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-trusted-ca-bundle\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.506881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.506006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-service-ca\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.506881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.506152 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-config\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.506881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.506186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-oauth-serving-cert\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.506881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.506231 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-trusted-ca-bundle\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.506881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.506287 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-serving-cert\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.506881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.506313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnlgr\" (UniqueName: \"kubernetes.io/projected/b9fe5138-06a1-4672-b71c-fc09aff96f0a-kube-api-access-tnlgr\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 
18:46:31.506881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.506352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-oauth-config\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.508193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.508161 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-oauth-serving-cert\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.508621 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.508404 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-trusted-ca-bundle\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.509047 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.508997 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-config\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.509300 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.509270 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-service-ca\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " 
pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.510148 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.510110 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-oauth-config\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.510526 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.510505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-serving-cert\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.517207 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.517170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnlgr\" (UniqueName: \"kubernetes.io/projected/b9fe5138-06a1-4672-b71c-fc09aff96f0a-kube-api-access-tnlgr\") pod \"console-5d68fd887-gbs86\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:31.646356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:31.646248 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:32.650770 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:32.650731 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:32.651248 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:32.650802 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:32.656353 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:32.656326 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:33.072410 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:33.072374 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:46:34.281593 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:34.281562 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d68fd887-gbs86"] Apr 22 18:46:34.284640 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:46:34.284610 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9fe5138_06a1_4672_b71c_fc09aff96f0a.slice/crio-86fb57e73b04e62545d7a6c0dd2afa0716575f73bc4938349c2e093aa67e1a93 WatchSource:0}: Error finding container 86fb57e73b04e62545d7a6c0dd2afa0716575f73bc4938349c2e093aa67e1a93: Status 404 returned error can't find the container with id 86fb57e73b04e62545d7a6c0dd2afa0716575f73bc4938349c2e093aa67e1a93 Apr 22 18:46:35.076494 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:35.076448 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4mw78" event={"ID":"37e5a887-0921-4401-b900-dc37b5b8a892","Type":"ContainerStarted","Data":"4adce7e3dd1c22a93720cfddec83637e097f8a3132e6eeb954e4fb9f0e2af093"} Apr 22 
18:46:35.076796 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:35.076596 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-4mw78" Apr 22 18:46:35.078224 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:35.078193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d68fd887-gbs86" event={"ID":"b9fe5138-06a1-4672-b71c-fc09aff96f0a","Type":"ContainerStarted","Data":"1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255"} Apr 22 18:46:35.078400 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:35.078385 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d68fd887-gbs86" event={"ID":"b9fe5138-06a1-4672-b71c-fc09aff96f0a","Type":"ContainerStarted","Data":"86fb57e73b04e62545d7a6c0dd2afa0716575f73bc4938349c2e093aa67e1a93"} Apr 22 18:46:35.090969 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:35.090941 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-4mw78" Apr 22 18:46:35.097089 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:35.097020 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-4mw78" podStartSLOduration=1.551796662 podStartE2EDuration="19.096997228s" podCreationTimestamp="2026-04-22 18:46:16 +0000 UTC" firstStartedPulling="2026-04-22 18:46:16.695533839 +0000 UTC m=+160.832878239" lastFinishedPulling="2026-04-22 18:46:34.240734398 +0000 UTC m=+178.378078805" observedRunningTime="2026-04-22 18:46:35.094125692 +0000 UTC m=+179.231470116" watchObservedRunningTime="2026-04-22 18:46:35.096997228 +0000 UTC m=+179.234341642" Apr 22 18:46:35.119104 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:35.119038 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d68fd887-gbs86" podStartSLOduration=4.119016462 podStartE2EDuration="4.119016462s" 
podCreationTimestamp="2026-04-22 18:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:35.118383963 +0000 UTC m=+179.255728377" watchObservedRunningTime="2026-04-22 18:46:35.119016462 +0000 UTC m=+179.256360878" Apr 22 18:46:41.647308 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:41.647262 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:41.647839 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:41.647324 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:41.653215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:41.653190 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:42.107369 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:42.107338 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:46:42.161472 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:42.161439 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84ddbb5c67-wrktf"] Apr 22 18:46:44.109684 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:44.109644 2569 generic.go:358] "Generic (PLEG): container finished" podID="da28986a-53c5-4e28-9290-9ebf6cffc2ae" containerID="cac59480c8c1363e10d8f571f0b27a79b46900d29e6afc025bcad280509b148d" exitCode=0 Apr 22 18:46:44.110154 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:44.109722 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" event={"ID":"da28986a-53c5-4e28-9290-9ebf6cffc2ae","Type":"ContainerDied","Data":"cac59480c8c1363e10d8f571f0b27a79b46900d29e6afc025bcad280509b148d"} 
Apr 22 18:46:44.110154 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:44.110090 2569 scope.go:117] "RemoveContainer" containerID="cac59480c8c1363e10d8f571f0b27a79b46900d29e6afc025bcad280509b148d" Apr 22 18:46:45.114163 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:45.114128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6fj2l" event={"ID":"da28986a-53c5-4e28-9290-9ebf6cffc2ae","Type":"ContainerStarted","Data":"0bec7402b0e6c31b9960738ffa93f905fd0088130d50f001426b29d13a9b2317"} Apr 22 18:46:54.077161 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:46:54.077132 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8xrxz_c49b1678-0299-4b9e-a911-9a8cf6fedf32/serve-healthcheck-canary/0.log" Apr 22 18:47:07.186208 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.186140 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84ddbb5c67-wrktf" podUID="4b3eedf8-35e0-469d-9e18-020cbe2478b9" containerName="console" containerID="cri-o://1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903" gracePeriod=15 Apr 22 18:47:07.476883 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.476855 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84ddbb5c67-wrktf_4b3eedf8-35e0-469d-9e18-020cbe2478b9/console/0.log" Apr 22 18:47:07.477030 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.476921 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:47:07.526140 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526107 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-serving-cert\") pod \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " Apr 22 18:47:07.526327 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526147 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-oauth-config\") pod \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " Apr 22 18:47:07.526327 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526187 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-config\") pod \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " Apr 22 18:47:07.526327 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526219 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-oauth-serving-cert\") pod \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " Apr 22 18:47:07.526327 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526249 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvvjm\" (UniqueName: \"kubernetes.io/projected/4b3eedf8-35e0-469d-9e18-020cbe2478b9-kube-api-access-rvvjm\") pod \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " Apr 22 
18:47:07.526327 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526272 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-service-ca\") pod \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\" (UID: \"4b3eedf8-35e0-469d-9e18-020cbe2478b9\") " Apr 22 18:47:07.526730 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526682 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-config" (OuterVolumeSpecName: "console-config") pod "4b3eedf8-35e0-469d-9e18-020cbe2478b9" (UID: "4b3eedf8-35e0-469d-9e18-020cbe2478b9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:07.526818 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.526734 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-service-ca" (OuterVolumeSpecName: "service-ca") pod "4b3eedf8-35e0-469d-9e18-020cbe2478b9" (UID: "4b3eedf8-35e0-469d-9e18-020cbe2478b9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:07.527029 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.527003 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4b3eedf8-35e0-469d-9e18-020cbe2478b9" (UID: "4b3eedf8-35e0-469d-9e18-020cbe2478b9"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:07.528865 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.528825 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3eedf8-35e0-469d-9e18-020cbe2478b9-kube-api-access-rvvjm" (OuterVolumeSpecName: "kube-api-access-rvvjm") pod "4b3eedf8-35e0-469d-9e18-020cbe2478b9" (UID: "4b3eedf8-35e0-469d-9e18-020cbe2478b9"). InnerVolumeSpecName "kube-api-access-rvvjm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:07.528865 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.528835 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4b3eedf8-35e0-469d-9e18-020cbe2478b9" (UID: "4b3eedf8-35e0-469d-9e18-020cbe2478b9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:07.529550 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.529511 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4b3eedf8-35e0-469d-9e18-020cbe2478b9" (UID: "4b3eedf8-35e0-469d-9e18-020cbe2478b9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:07.627365 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.627328 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-oauth-config\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:47:07.627365 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.627362 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-config\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:47:07.627365 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.627373 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-oauth-serving-cert\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:47:07.627589 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.627382 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvvjm\" (UniqueName: \"kubernetes.io/projected/4b3eedf8-35e0-469d-9e18-020cbe2478b9-kube-api-access-rvvjm\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:47:07.627589 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.627392 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4b3eedf8-35e0-469d-9e18-020cbe2478b9-service-ca\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:47:07.627589 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:07.627401 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4b3eedf8-35e0-469d-9e18-020cbe2478b9-console-serving-cert\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:47:08.188896 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:47:08.188869 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84ddbb5c67-wrktf_4b3eedf8-35e0-469d-9e18-020cbe2478b9/console/0.log" Apr 22 18:47:08.189293 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.188910 2569 generic.go:358] "Generic (PLEG): container finished" podID="4b3eedf8-35e0-469d-9e18-020cbe2478b9" containerID="1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903" exitCode=2 Apr 22 18:47:08.189293 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.188944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84ddbb5c67-wrktf" event={"ID":"4b3eedf8-35e0-469d-9e18-020cbe2478b9","Type":"ContainerDied","Data":"1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903"} Apr 22 18:47:08.189293 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.188973 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84ddbb5c67-wrktf" Apr 22 18:47:08.189293 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.188987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84ddbb5c67-wrktf" event={"ID":"4b3eedf8-35e0-469d-9e18-020cbe2478b9","Type":"ContainerDied","Data":"bcc6d8a5bbeb3505a82f869063aaae191f88a823aa21fa45cd82f3289cda17a8"} Apr 22 18:47:08.189293 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.189005 2569 scope.go:117] "RemoveContainer" containerID="1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903" Apr 22 18:47:08.197782 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.197765 2569 scope.go:117] "RemoveContainer" containerID="1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903" Apr 22 18:47:08.198050 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:47:08.198033 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903\": container with ID starting with 1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903 not found: ID does not exist" containerID="1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903" Apr 22 18:47:08.198095 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.198058 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903"} err="failed to get container status \"1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903\": rpc error: code = NotFound desc = could not find container \"1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903\": container with ID starting with 1c5bf3467649f92535f1d438130ad8e0af680436bab92d2eea35917bf6864903 not found: ID does not exist" Apr 22 18:47:08.210660 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.210626 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84ddbb5c67-wrktf"] Apr 22 18:47:08.212548 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.212525 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84ddbb5c67-wrktf"] Apr 22 18:47:08.453513 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:47:08.453433 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3eedf8-35e0-469d-9e18-020cbe2478b9" path="/var/lib/kubelet/pods/4b3eedf8-35e0-469d-9e18-020cbe2478b9/volumes" Apr 22 18:48:00.632116 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.632034 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-z5sdb"] Apr 22 18:48:00.632525 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.632374 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b3eedf8-35e0-469d-9e18-020cbe2478b9" containerName="console" Apr 22 18:48:00.632525 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:48:00.632388 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3eedf8-35e0-469d-9e18-020cbe2478b9" containerName="console" Apr 22 18:48:00.632525 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.632482 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b3eedf8-35e0-469d-9e18-020cbe2478b9" containerName="console" Apr 22 18:48:00.635344 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.635326 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.637150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.637123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-original-pull-secret\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.637265 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.637168 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-kubelet-config\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.637265 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.637230 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-dbus\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.638010 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.637993 2569 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:48:00.644989 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.644966 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z5sdb"] Apr 22 18:48:00.736909 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.736874 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57c7686d4d-wtrkk"] Apr 22 18:48:00.738349 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.738328 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-original-pull-secret\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.738452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.738365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-kubelet-config\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.738452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.738405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-dbus\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.738550 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.738528 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-kubelet-config\") pod 
\"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.738586 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.738556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-dbus\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.739936 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.739913 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.740813 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.740792 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a8ed4379-675b-4bd5-92be-4ce4a4aa88a7-original-pull-secret\") pod \"global-pull-secret-syncer-z5sdb\" (UID: \"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7\") " pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.754698 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.754669 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57c7686d4d-wtrkk"] Apr 22 18:48:00.839413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.839372 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-trusted-ca-bundle\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.839413 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.839413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-serving-cert\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.839644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.839429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvq7f\" (UniqueName: \"kubernetes.io/projected/accf0b03-b9e8-40ed-9e8b-59bab6047583-kube-api-access-bvq7f\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.839644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.839526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-service-ca\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.839644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.839564 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-oauth-serving-cert\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.839644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.839605 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-oauth-config\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.839644 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:48:00.839626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-config\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.940348 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.940253 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-oauth-config\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.940348 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.940298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-config\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.940348 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.940329 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-trusted-ca-bundle\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.940348 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.940350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-serving-cert\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " 
pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.940644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.940365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvq7f\" (UniqueName: \"kubernetes.io/projected/accf0b03-b9e8-40ed-9e8b-59bab6047583-kube-api-access-bvq7f\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.940644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.940426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-service-ca\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.940644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.940469 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-oauth-serving-cert\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.941274 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.941246 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-config\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.941374 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.941272 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-oauth-serving-cert\") pod \"console-57c7686d4d-wtrkk\" 
(UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.941374 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.941274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-service-ca\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.941449 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.941386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-trusted-ca-bundle\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.942920 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.942898 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-oauth-config\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.942987 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.942967 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-serving-cert\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:00.945812 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.945790 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-z5sdb" Apr 22 18:48:00.949428 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:00.949404 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvq7f\" (UniqueName: \"kubernetes.io/projected/accf0b03-b9e8-40ed-9e8b-59bab6047583-kube-api-access-bvq7f\") pod \"console-57c7686d4d-wtrkk\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") " pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:01.062392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:01.062345 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:01.083974 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:01.083941 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z5sdb"] Apr 22 18:48:01.087567 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:48:01.087530 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ed4379_675b_4bd5_92be_4ce4a4aa88a7.slice/crio-98db7398bbf2415b54a6d28ba9396d7d50ef8a297883ce21e8827adb916e7ed0 WatchSource:0}: Error finding container 98db7398bbf2415b54a6d28ba9396d7d50ef8a297883ce21e8827adb916e7ed0: Status 404 returned error can't find the container with id 98db7398bbf2415b54a6d28ba9396d7d50ef8a297883ce21e8827adb916e7ed0 Apr 22 18:48:01.197302 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:01.197274 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57c7686d4d-wtrkk"] Apr 22 18:48:01.199694 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:48:01.199642 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaccf0b03_b9e8_40ed_9e8b_59bab6047583.slice/crio-524d9dc916f3756242d366577873bb21ba64f9f118dc266c6006ba97e09acb56 WatchSource:0}: Error finding container 
524d9dc916f3756242d366577873bb21ba64f9f118dc266c6006ba97e09acb56: Status 404 returned error can't find the container with id 524d9dc916f3756242d366577873bb21ba64f9f118dc266c6006ba97e09acb56 Apr 22 18:48:01.344617 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:01.344580 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c7686d4d-wtrkk" event={"ID":"accf0b03-b9e8-40ed-9e8b-59bab6047583","Type":"ContainerStarted","Data":"e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c"} Apr 22 18:48:01.344617 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:01.344623 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c7686d4d-wtrkk" event={"ID":"accf0b03-b9e8-40ed-9e8b-59bab6047583","Type":"ContainerStarted","Data":"524d9dc916f3756242d366577873bb21ba64f9f118dc266c6006ba97e09acb56"} Apr 22 18:48:01.345748 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:01.345700 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z5sdb" event={"ID":"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7","Type":"ContainerStarted","Data":"98db7398bbf2415b54a6d28ba9396d7d50ef8a297883ce21e8827adb916e7ed0"} Apr 22 18:48:01.367259 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:01.367208 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57c7686d4d-wtrkk" podStartSLOduration=1.367191536 podStartE2EDuration="1.367191536s" podCreationTimestamp="2026-04-22 18:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:01.365884811 +0000 UTC m=+265.503229236" watchObservedRunningTime="2026-04-22 18:48:01.367191536 +0000 UTC m=+265.504535949" Apr 22 18:48:05.359052 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:05.359016 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z5sdb" 
event={"ID":"a8ed4379-675b-4bd5-92be-4ce4a4aa88a7","Type":"ContainerStarted","Data":"94b681f4ad88171aea71493ec3ad0051cbd851904b909cc8dfffe35dd90072c4"} Apr 22 18:48:05.376075 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:05.375972 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-z5sdb" podStartSLOduration=1.353986215 podStartE2EDuration="5.375956571s" podCreationTimestamp="2026-04-22 18:48:00 +0000 UTC" firstStartedPulling="2026-04-22 18:48:01.089401268 +0000 UTC m=+265.226745661" lastFinishedPulling="2026-04-22 18:48:05.111371621 +0000 UTC m=+269.248716017" observedRunningTime="2026-04-22 18:48:05.374545431 +0000 UTC m=+269.511889845" watchObservedRunningTime="2026-04-22 18:48:05.375956571 +0000 UTC m=+269.513300984" Apr 22 18:48:11.063187 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:11.063142 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:11.063187 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:11.063196 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:11.067822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:11.067797 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:11.380959 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:11.380882 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57c7686d4d-wtrkk" Apr 22 18:48:11.430170 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:11.430131 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d68fd887-gbs86"] Apr 22 18:48:12.686467 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.686425 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb"] Apr 22 18:48:12.696030 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.696001 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.698613 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.698580 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:48:12.698773 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.698629 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb"] Apr 22 18:48:12.699575 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.699545 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-c5ljz\"" Apr 22 18:48:12.699700 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.699571 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:48:12.730471 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.730433 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.730471 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.730471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghdmz\" (UniqueName: 
\"kubernetes.io/projected/f3523209-96d6-46ba-8b9c-f5efb3036e37-kube-api-access-ghdmz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.730725 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.730500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.831209 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.831165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.831209 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.831208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghdmz\" (UniqueName: \"kubernetes.io/projected/f3523209-96d6-46ba-8b9c-f5efb3036e37-kube-api-access-ghdmz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.831467 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.831250 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.831656 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.831615 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.831763 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.831627 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:12.839933 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:12.839901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghdmz\" (UniqueName: \"kubernetes.io/projected/f3523209-96d6-46ba-8b9c-f5efb3036e37-kube-api-access-ghdmz\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:13.006815 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:13.006707 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:13.131044 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:13.131015 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb"] Apr 22 18:48:13.133485 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:48:13.133457 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3523209_96d6_46ba_8b9c_f5efb3036e37.slice/crio-cea3cda6ec9a299224b7aeed0e97b59b8d755d3151b8def6563d12ee806ad4b3 WatchSource:0}: Error finding container cea3cda6ec9a299224b7aeed0e97b59b8d755d3151b8def6563d12ee806ad4b3: Status 404 returned error can't find the container with id cea3cda6ec9a299224b7aeed0e97b59b8d755d3151b8def6563d12ee806ad4b3 Apr 22 18:48:13.383675 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:13.383642 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" event={"ID":"f3523209-96d6-46ba-8b9c-f5efb3036e37","Type":"ContainerStarted","Data":"cea3cda6ec9a299224b7aeed0e97b59b8d755d3151b8def6563d12ee806ad4b3"} Apr 22 18:48:20.406872 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:20.406837 2569 generic.go:358] "Generic (PLEG): container finished" podID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerID="023fadb37c5b6d7eaaa2aac12692785556f1b6122237d732728704283597a506" exitCode=0 Apr 22 18:48:20.407275 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:20.406898 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" event={"ID":"f3523209-96d6-46ba-8b9c-f5efb3036e37","Type":"ContainerDied","Data":"023fadb37c5b6d7eaaa2aac12692785556f1b6122237d732728704283597a506"} Apr 22 18:48:23.417126 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:48:23.417078 2569 generic.go:358] "Generic (PLEG): container finished" podID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerID="13c3220dd901832b7bec12ec5b0bb34216791cd99b1909c743975fccf36bbb81" exitCode=0 Apr 22 18:48:23.417636 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:23.417156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" event={"ID":"f3523209-96d6-46ba-8b9c-f5efb3036e37","Type":"ContainerDied","Data":"13c3220dd901832b7bec12ec5b0bb34216791cd99b1909c743975fccf36bbb81"} Apr 22 18:48:33.451353 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:33.451309 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" event={"ID":"f3523209-96d6-46ba-8b9c-f5efb3036e37","Type":"ContainerStarted","Data":"34dbb029f1ab3fc87701c9b05e77ad614022a071348cc8d5f3ec21c1e3c997eb"} Apr 22 18:48:33.472452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:33.472390 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" podStartSLOduration=1.254310459 podStartE2EDuration="21.472373045s" podCreationTimestamp="2026-04-22 18:48:12 +0000 UTC" firstStartedPulling="2026-04-22 18:48:13.13544652 +0000 UTC m=+277.272790914" lastFinishedPulling="2026-04-22 18:48:33.353509091 +0000 UTC m=+297.490853500" observedRunningTime="2026-04-22 18:48:33.470723398 +0000 UTC m=+297.608067809" watchObservedRunningTime="2026-04-22 18:48:33.472373045 +0000 UTC m=+297.609717457" Apr 22 18:48:34.455791 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:34.455755 2569 generic.go:358] "Generic (PLEG): container finished" podID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerID="34dbb029f1ab3fc87701c9b05e77ad614022a071348cc8d5f3ec21c1e3c997eb" exitCode=0 Apr 22 18:48:34.456158 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:48:34.455813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" event={"ID":"f3523209-96d6-46ba-8b9c-f5efb3036e37","Type":"ContainerDied","Data":"34dbb029f1ab3fc87701c9b05e77ad614022a071348cc8d5f3ec21c1e3c997eb"} Apr 22 18:48:35.582609 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.582586 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:35.740028 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.739927 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghdmz\" (UniqueName: \"kubernetes.io/projected/f3523209-96d6-46ba-8b9c-f5efb3036e37-kube-api-access-ghdmz\") pod \"f3523209-96d6-46ba-8b9c-f5efb3036e37\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " Apr 22 18:48:35.740028 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.740022 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-util\") pod \"f3523209-96d6-46ba-8b9c-f5efb3036e37\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " Apr 22 18:48:35.740254 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.740075 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-bundle\") pod \"f3523209-96d6-46ba-8b9c-f5efb3036e37\" (UID: \"f3523209-96d6-46ba-8b9c-f5efb3036e37\") " Apr 22 18:48:35.740797 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.740765 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-bundle" (OuterVolumeSpecName: "bundle") pod "f3523209-96d6-46ba-8b9c-f5efb3036e37" (UID: 
"f3523209-96d6-46ba-8b9c-f5efb3036e37"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:48:35.742173 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.742148 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3523209-96d6-46ba-8b9c-f5efb3036e37-kube-api-access-ghdmz" (OuterVolumeSpecName: "kube-api-access-ghdmz") pod "f3523209-96d6-46ba-8b9c-f5efb3036e37" (UID: "f3523209-96d6-46ba-8b9c-f5efb3036e37"). InnerVolumeSpecName "kube-api-access-ghdmz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:48:35.744105 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.744076 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-util" (OuterVolumeSpecName: "util") pod "f3523209-96d6-46ba-8b9c-f5efb3036e37" (UID: "f3523209-96d6-46ba-8b9c-f5efb3036e37"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:48:35.841038 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.840987 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:48:35.841038 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.841032 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ghdmz\" (UniqueName: \"kubernetes.io/projected/f3523209-96d6-46ba-8b9c-f5efb3036e37-kube-api-access-ghdmz\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:48:35.841038 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:35.841044 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f3523209-96d6-46ba-8b9c-f5efb3036e37-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:48:36.364095 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:48:36.364066 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:48:36.364291 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.364157 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:48:36.370652 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.370628 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:48:36.370886 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.370862 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:48:36.373289 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.373272 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:48:36.451332 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.451178 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d68fd887-gbs86" podUID="b9fe5138-06a1-4672-b71c-fc09aff96f0a" containerName="console" containerID="cri-o://1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255" gracePeriod=15 Apr 22 18:48:36.464104 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.462547 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" event={"ID":"f3523209-96d6-46ba-8b9c-f5efb3036e37","Type":"ContainerDied","Data":"cea3cda6ec9a299224b7aeed0e97b59b8d755d3151b8def6563d12ee806ad4b3"} Apr 22 18:48:36.464104 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.462573 2569 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea3cda6ec9a299224b7aeed0e97b59b8d755d3151b8def6563d12ee806ad4b3" Apr 22 18:48:36.464104 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.462606 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4drtb" Apr 22 18:48:36.696468 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.696445 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d68fd887-gbs86_b9fe5138-06a1-4672-b71c-fc09aff96f0a/console/0.log" Apr 22 18:48:36.696800 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.696513 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d68fd887-gbs86" Apr 22 18:48:36.849619 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.849576 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-serving-cert\") pod \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " Apr 22 18:48:36.849619 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.849622 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnlgr\" (UniqueName: \"kubernetes.io/projected/b9fe5138-06a1-4672-b71c-fc09aff96f0a-kube-api-access-tnlgr\") pod \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " Apr 22 18:48:36.849913 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.849648 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-oauth-config\") pod \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\" (UID: 
\"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " Apr 22 18:48:36.849913 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.849680 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-trusted-ca-bundle\") pod \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " Apr 22 18:48:36.849913 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.849762 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-oauth-serving-cert\") pod \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " Apr 22 18:48:36.849913 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.849795 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-service-ca\") pod \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " Apr 22 18:48:36.849913 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.849828 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-config\") pod \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\" (UID: \"b9fe5138-06a1-4672-b71c-fc09aff96f0a\") " Apr 22 18:48:36.850301 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.850269 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-service-ca" (OuterVolumeSpecName: "service-ca") pod "b9fe5138-06a1-4672-b71c-fc09aff96f0a" (UID: "b9fe5138-06a1-4672-b71c-fc09aff96f0a"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:36.850397 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.850296 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b9fe5138-06a1-4672-b71c-fc09aff96f0a" (UID: "b9fe5138-06a1-4672-b71c-fc09aff96f0a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:36.850397 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.850300 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b9fe5138-06a1-4672-b71c-fc09aff96f0a" (UID: "b9fe5138-06a1-4672-b71c-fc09aff96f0a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:36.850397 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.850346 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-config" (OuterVolumeSpecName: "console-config") pod "b9fe5138-06a1-4672-b71c-fc09aff96f0a" (UID: "b9fe5138-06a1-4672-b71c-fc09aff96f0a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:48:36.852124 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.852105 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b9fe5138-06a1-4672-b71c-fc09aff96f0a" (UID: "b9fe5138-06a1-4672-b71c-fc09aff96f0a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:36.852395 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.852371 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b9fe5138-06a1-4672-b71c-fc09aff96f0a" (UID: "b9fe5138-06a1-4672-b71c-fc09aff96f0a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:48:36.852395 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.852385 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9fe5138-06a1-4672-b71c-fc09aff96f0a-kube-api-access-tnlgr" (OuterVolumeSpecName: "kube-api-access-tnlgr") pod "b9fe5138-06a1-4672-b71c-fc09aff96f0a" (UID: "b9fe5138-06a1-4672-b71c-fc09aff96f0a"). InnerVolumeSpecName "kube-api-access-tnlgr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:48:36.951122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.951039 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-trusted-ca-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:48:36.951122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.951070 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-oauth-serving-cert\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:48:36.951122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.951080 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-service-ca\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:48:36.951122 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:48:36.951090 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-config\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:48:36.951122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.951099 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-serving-cert\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:48:36.951122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.951107 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnlgr\" (UniqueName: \"kubernetes.io/projected/b9fe5138-06a1-4672-b71c-fc09aff96f0a-kube-api-access-tnlgr\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:48:36.951122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:36.951118 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9fe5138-06a1-4672-b71c-fc09aff96f0a-console-oauth-config\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:48:37.466271 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.466239 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d68fd887-gbs86_b9fe5138-06a1-4672-b71c-fc09aff96f0a/console/0.log"
Apr 22 18:48:37.466459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.466282 2569 generic.go:358] "Generic (PLEG): container finished" podID="b9fe5138-06a1-4672-b71c-fc09aff96f0a" containerID="1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255" exitCode=2
Apr 22 18:48:37.466459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.466352 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d68fd887-gbs86"
Apr 22 18:48:37.466459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.466365 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d68fd887-gbs86" event={"ID":"b9fe5138-06a1-4672-b71c-fc09aff96f0a","Type":"ContainerDied","Data":"1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255"}
Apr 22 18:48:37.466459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.466391 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d68fd887-gbs86" event={"ID":"b9fe5138-06a1-4672-b71c-fc09aff96f0a","Type":"ContainerDied","Data":"86fb57e73b04e62545d7a6c0dd2afa0716575f73bc4938349c2e093aa67e1a93"}
Apr 22 18:48:37.466459 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.466406 2569 scope.go:117] "RemoveContainer" containerID="1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255"
Apr 22 18:48:37.475699 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.475676 2569 scope.go:117] "RemoveContainer" containerID="1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255"
Apr 22 18:48:37.476015 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:48:37.475993 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255\": container with ID starting with 1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255 not found: ID does not exist" containerID="1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255"
Apr 22 18:48:37.476072 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.476029 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255"} err="failed to get container status \"1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255\": rpc error: code = NotFound desc = could not find container \"1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255\": container with ID starting with 1b6a01731154dae5ef9b9782a07f9b63ff95eaeb1fa9d040f2bb4fe5c9554255 not found: ID does not exist"
Apr 22 18:48:37.488112 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.488083 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d68fd887-gbs86"]
Apr 22 18:48:37.491987 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:37.491959 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d68fd887-gbs86"]
Apr 22 18:48:38.457080 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:38.457047 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9fe5138-06a1-4672-b71c-fc09aff96f0a" path="/var/lib/kubelet/pods/b9fe5138-06a1-4672-b71c-fc09aff96f0a/volumes"
Apr 22 18:48:40.739854 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.739826 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"]
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740106 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9fe5138-06a1-4672-b71c-fc09aff96f0a" containerName="console"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740117 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fe5138-06a1-4672-b71c-fc09aff96f0a" containerName="console"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740126 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerName="pull"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740132 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerName="pull"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740139 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerName="extract"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740144 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerName="extract"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740155 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerName="util"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740160 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerName="util"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740214 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9fe5138-06a1-4672-b71c-fc09aff96f0a" containerName="console"
Apr 22 18:48:40.740242 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.740221 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3523209-96d6-46ba-8b9c-f5efb3036e37" containerName="extract"
Apr 22 18:48:40.744643 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.744624 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:40.748213 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.748185 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:48:40.748347 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.748250 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 18:48:40.748764 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.748750 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-48hvp\""
Apr 22 18:48:40.767298 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.767260 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"]
Apr 22 18:48:40.880216 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.880180 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2rck\" (UniqueName: \"kubernetes.io/projected/0f469169-e236-429d-857b-85d7c79dbc88-kube-api-access-n2rck\") pod \"cert-manager-operator-controller-manager-54b9655956-8qvqf\" (UID: \"0f469169-e236-429d-857b-85d7c79dbc88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:40.880400 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.880250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f469169-e236-429d-857b-85d7c79dbc88-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8qvqf\" (UID: \"0f469169-e236-429d-857b-85d7c79dbc88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:40.981557 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.981516 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f469169-e236-429d-857b-85d7c79dbc88-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8qvqf\" (UID: \"0f469169-e236-429d-857b-85d7c79dbc88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:40.981767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.981577 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2rck\" (UniqueName: \"kubernetes.io/projected/0f469169-e236-429d-857b-85d7c79dbc88-kube-api-access-n2rck\") pod \"cert-manager-operator-controller-manager-54b9655956-8qvqf\" (UID: \"0f469169-e236-429d-857b-85d7c79dbc88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:40.981928 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.981908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f469169-e236-429d-857b-85d7c79dbc88-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8qvqf\" (UID: \"0f469169-e236-429d-857b-85d7c79dbc88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:40.996264 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:40.996191 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2rck\" (UniqueName: \"kubernetes.io/projected/0f469169-e236-429d-857b-85d7c79dbc88-kube-api-access-n2rck\") pod \"cert-manager-operator-controller-manager-54b9655956-8qvqf\" (UID: \"0f469169-e236-429d-857b-85d7c79dbc88\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:41.053861 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:41.053823 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"
Apr 22 18:48:41.186809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:41.186725 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf"]
Apr 22 18:48:41.189590 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:48:41.189556 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f469169_e236_429d_857b_85d7c79dbc88.slice/crio-9951d30ff10bc637e6ce6c9d36bcccda293625ed0667abded8866563c4b48933 WatchSource:0}: Error finding container 9951d30ff10bc637e6ce6c9d36bcccda293625ed0667abded8866563c4b48933: Status 404 returned error can't find the container with id 9951d30ff10bc637e6ce6c9d36bcccda293625ed0667abded8866563c4b48933
Apr 22 18:48:41.192063 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:41.192044 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:48:41.480793 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:41.480757 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf" event={"ID":"0f469169-e236-429d-857b-85d7c79dbc88","Type":"ContainerStarted","Data":"9951d30ff10bc637e6ce6c9d36bcccda293625ed0667abded8866563c4b48933"}
Apr 22 18:48:43.489747 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:43.489688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf" event={"ID":"0f469169-e236-429d-857b-85d7c79dbc88","Type":"ContainerStarted","Data":"d28238d5c0d5b1640830c1387146a8e1195d9eb2433b84fe579d410ba14c6766"}
Apr 22 18:48:43.515458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:43.515405 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8qvqf" podStartSLOduration=1.314065178 podStartE2EDuration="3.515390171s" podCreationTimestamp="2026-04-22 18:48:40 +0000 UTC" firstStartedPulling="2026-04-22 18:48:41.192186724 +0000 UTC m=+305.329531129" lastFinishedPulling="2026-04-22 18:48:43.393511727 +0000 UTC m=+307.530856122" observedRunningTime="2026-04-22 18:48:43.513623868 +0000 UTC m=+307.650968281" watchObservedRunningTime="2026-04-22 18:48:43.515390171 +0000 UTC m=+307.652734584"
Apr 22 18:48:44.567750 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.567694 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"]
Apr 22 18:48:44.571012 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.570990 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.574132 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.574107 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-c5ljz\""
Apr 22 18:48:44.574256 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.574179 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 18:48:44.575097 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.575075 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 18:48:44.588132 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.588104 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"]
Apr 22 18:48:44.712089 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.712051 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.712089 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.712091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbqpj\" (UniqueName: \"kubernetes.io/projected/e7ed9a9d-5778-495e-8588-a4e65a25045d-kube-api-access-vbqpj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.712324 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.712126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.813056 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.813014 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.813056 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.813050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbqpj\" (UniqueName: \"kubernetes.io/projected/e7ed9a9d-5778-495e-8588-a4e65a25045d-kube-api-access-vbqpj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.813329 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.813080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.813504 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.813484 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.813568 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.813543 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.821953 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.821891 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbqpj\" (UniqueName: \"kubernetes.io/projected/e7ed9a9d-5778-495e-8588-a4e65a25045d-kube-api-access-vbqpj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:44.880900 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:44.880861 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:45.013580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:45.013553 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"]
Apr 22 18:48:45.016229 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:48:45.016191 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7ed9a9d_5778_495e_8588_a4e65a25045d.slice/crio-14fea8daccdcc85ddfe69c1a4cd762ecd105c0800d3125ac12e348b6c7dcba28 WatchSource:0}: Error finding container 14fea8daccdcc85ddfe69c1a4cd762ecd105c0800d3125ac12e348b6c7dcba28: Status 404 returned error can't find the container with id 14fea8daccdcc85ddfe69c1a4cd762ecd105c0800d3125ac12e348b6c7dcba28
Apr 22 18:48:45.502926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:45.502833 2569 generic.go:358] "Generic (PLEG): container finished" podID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerID="ca23de3829fdd15fa5fd5a81a2a5080868d4036062c3529175e99a40aea23f82" exitCode=0
Apr 22 18:48:45.503107 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:45.502924 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp" event={"ID":"e7ed9a9d-5778-495e-8588-a4e65a25045d","Type":"ContainerDied","Data":"ca23de3829fdd15fa5fd5a81a2a5080868d4036062c3529175e99a40aea23f82"}
Apr 22 18:48:45.503107 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:45.502964 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp" event={"ID":"e7ed9a9d-5778-495e-8588-a4e65a25045d","Type":"ContainerStarted","Data":"14fea8daccdcc85ddfe69c1a4cd762ecd105c0800d3125ac12e348b6c7dcba28"}
Apr 22 18:48:46.861626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:46.861589 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-52gfq"]
Apr 22 18:48:46.866604 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:46.866575 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:46.869849 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:46.869826 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 18:48:46.870908 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:46.870883 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-8vzhz\""
Apr 22 18:48:46.871023 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:46.870920 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 18:48:46.875874 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:46.875851 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-52gfq"]
Apr 22 18:48:47.035401 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.035355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a48871ba-2058-43e3-86e9-16dcd5a93387-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-52gfq\" (UID: \"a48871ba-2058-43e3-86e9-16dcd5a93387\") " pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:47.035567 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.035424 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/a48871ba-2058-43e3-86e9-16dcd5a93387-kube-api-access-6kksl\") pod \"cert-manager-webhook-587ccfb98-52gfq\" (UID: \"a48871ba-2058-43e3-86e9-16dcd5a93387\") " pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:47.136761 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.136660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a48871ba-2058-43e3-86e9-16dcd5a93387-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-52gfq\" (UID: \"a48871ba-2058-43e3-86e9-16dcd5a93387\") " pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:47.136761 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.136757 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/a48871ba-2058-43e3-86e9-16dcd5a93387-kube-api-access-6kksl\") pod \"cert-manager-webhook-587ccfb98-52gfq\" (UID: \"a48871ba-2058-43e3-86e9-16dcd5a93387\") " pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:47.162635 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.162572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a48871ba-2058-43e3-86e9-16dcd5a93387-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-52gfq\" (UID: \"a48871ba-2058-43e3-86e9-16dcd5a93387\") " pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:47.162826 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.162736 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kksl\" (UniqueName: \"kubernetes.io/projected/a48871ba-2058-43e3-86e9-16dcd5a93387-kube-api-access-6kksl\") pod \"cert-manager-webhook-587ccfb98-52gfq\" (UID: \"a48871ba-2058-43e3-86e9-16dcd5a93387\") " pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:47.189427 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.189395 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:47.667516 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:47.667494 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-52gfq"]
Apr 22 18:48:47.669799 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:48:47.669765 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda48871ba_2058_43e3_86e9_16dcd5a93387.slice/crio-7699126ad27d03d164bfd0e860460b940f1e09543383ba96cc89a81c3f1d4e3a WatchSource:0}: Error finding container 7699126ad27d03d164bfd0e860460b940f1e09543383ba96cc89a81c3f1d4e3a: Status 404 returned error can't find the container with id 7699126ad27d03d164bfd0e860460b940f1e09543383ba96cc89a81c3f1d4e3a
Apr 22 18:48:48.515062 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:48.514966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq" event={"ID":"a48871ba-2058-43e3-86e9-16dcd5a93387","Type":"ContainerStarted","Data":"7699126ad27d03d164bfd0e860460b940f1e09543383ba96cc89a81c3f1d4e3a"}
Apr 22 18:48:48.516560 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:48.516537 2569 generic.go:358] "Generic (PLEG): container finished" podID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerID="dce9631fccf08105173bced9735a8c6c8ac681546645f37ecf25004aa5bece6c" exitCode=0
Apr 22 18:48:48.516662 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:48.516584 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp" event={"ID":"e7ed9a9d-5778-495e-8588-a4e65a25045d","Type":"ContainerDied","Data":"dce9631fccf08105173bced9735a8c6c8ac681546645f37ecf25004aa5bece6c"}
Apr 22 18:48:49.523035 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:49.522995 2569 generic.go:358] "Generic (PLEG): container finished" podID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerID="dc9caede22db29338a276a819b3775bc412e2c9d4a275116c232821bc2c525a5" exitCode=0
Apr 22 18:48:49.523451 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:49.523075 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp" event={"ID":"e7ed9a9d-5778-495e-8588-a4e65a25045d","Type":"ContainerDied","Data":"dc9caede22db29338a276a819b3775bc412e2c9d4a275116c232821bc2c525a5"}
Apr 22 18:48:50.654129 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.654104 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:50.771779 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.771655 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-util\") pod \"e7ed9a9d-5778-495e-8588-a4e65a25045d\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") "
Apr 22 18:48:50.771779 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.771751 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbqpj\" (UniqueName: \"kubernetes.io/projected/e7ed9a9d-5778-495e-8588-a4e65a25045d-kube-api-access-vbqpj\") pod \"e7ed9a9d-5778-495e-8588-a4e65a25045d\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") "
Apr 22 18:48:50.771978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.771851 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-bundle\") pod \"e7ed9a9d-5778-495e-8588-a4e65a25045d\" (UID: \"e7ed9a9d-5778-495e-8588-a4e65a25045d\") "
Apr 22 18:48:50.772302 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.772270 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-bundle" (OuterVolumeSpecName: "bundle") pod "e7ed9a9d-5778-495e-8588-a4e65a25045d" (UID: "e7ed9a9d-5778-495e-8588-a4e65a25045d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:48:50.773915 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.773884 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ed9a9d-5778-495e-8588-a4e65a25045d-kube-api-access-vbqpj" (OuterVolumeSpecName: "kube-api-access-vbqpj") pod "e7ed9a9d-5778-495e-8588-a4e65a25045d" (UID: "e7ed9a9d-5778-495e-8588-a4e65a25045d"). InnerVolumeSpecName "kube-api-access-vbqpj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:48:50.775691 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.775666 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-util" (OuterVolumeSpecName: "util") pod "e7ed9a9d-5778-495e-8588-a4e65a25045d" (UID: "e7ed9a9d-5778-495e-8588-a4e65a25045d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:48:50.872580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.872541 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:48:50.872580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.872568 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e7ed9a9d-5778-495e-8588-a4e65a25045d-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:48:50.872580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:50.872579 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vbqpj\" (UniqueName: \"kubernetes.io/projected/e7ed9a9d-5778-495e-8588-a4e65a25045d-kube-api-access-vbqpj\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:48:51.530603 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:51.530563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq" event={"ID":"a48871ba-2058-43e3-86e9-16dcd5a93387","Type":"ContainerStarted","Data":"244684c22fabc9e474de8e108828a7e11c7f918f6258d9a289cea9a0130bda1a"}
Apr 22 18:48:51.530827 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:51.530656 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:51.532217 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:51.532194 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp" event={"ID":"e7ed9a9d-5778-495e-8588-a4e65a25045d","Type":"ContainerDied","Data":"14fea8daccdcc85ddfe69c1a4cd762ecd105c0800d3125ac12e348b6c7dcba28"}
Apr 22 18:48:51.532332 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:51.532223 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14fea8daccdcc85ddfe69c1a4cd762ecd105c0800d3125ac12e348b6c7dcba28"
Apr 22 18:48:51.532332 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:51.532225 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fc9gjp"
Apr 22 18:48:51.548505 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:51.548457 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq" podStartSLOduration=2.742417149 podStartE2EDuration="5.548444312s" podCreationTimestamp="2026-04-22 18:48:46 +0000 UTC" firstStartedPulling="2026-04-22 18:48:47.672283798 +0000 UTC m=+311.809628189" lastFinishedPulling="2026-04-22 18:48:50.478310957 +0000 UTC m=+314.615655352" observedRunningTime="2026-04-22 18:48:51.547683238 +0000 UTC m=+315.685027653" watchObservedRunningTime="2026-04-22 18:48:51.548444312 +0000 UTC m=+315.685788725"
Apr 22 18:48:57.441388 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441347 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hs8z2"]
Apr 22 18:48:57.441809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441648 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerName="extract"
Apr 22 18:48:57.441809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441659 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerName="extract"
Apr 22 18:48:57.441809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441671 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerName="util"
Apr 22 18:48:57.441809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441676 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerName="util"
Apr 22 18:48:57.441809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441688 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerName="pull"
Apr 22 18:48:57.441809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441694 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerName="pull"
Apr 22 18:48:57.441809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.441755 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7ed9a9d-5778-495e-8588-a4e65a25045d" containerName="extract"
Apr 22 18:48:57.448414 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.448389 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.450975 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.450926 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-mmr8r\""
Apr 22 18:48:57.452699 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.452674 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hs8z2"]
Apr 22 18:48:57.524562 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.524530 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/178fd53d-c559-44c9-a809-5bbc9441448e-bound-sa-token\") pod \"cert-manager-79c8d999ff-hs8z2\" (UID: \"178fd53d-c559-44c9-a809-5bbc9441448e\") " pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.524562 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.524567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lzf8\" (UniqueName: \"kubernetes.io/projected/178fd53d-c559-44c9-a809-5bbc9441448e-kube-api-access-7lzf8\") pod \"cert-manager-79c8d999ff-hs8z2\" (UID: \"178fd53d-c559-44c9-a809-5bbc9441448e\") " pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.538406 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.538380 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-52gfq"
Apr 22 18:48:57.624920 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.624884 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/178fd53d-c559-44c9-a809-5bbc9441448e-bound-sa-token\") pod \"cert-manager-79c8d999ff-hs8z2\" (UID: \"178fd53d-c559-44c9-a809-5bbc9441448e\") " pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.624920 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.624918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lzf8\" (UniqueName: \"kubernetes.io/projected/178fd53d-c559-44c9-a809-5bbc9441448e-kube-api-access-7lzf8\") pod \"cert-manager-79c8d999ff-hs8z2\" (UID: \"178fd53d-c559-44c9-a809-5bbc9441448e\") " pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.634153 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.634122 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/178fd53d-c559-44c9-a809-5bbc9441448e-bound-sa-token\") pod \"cert-manager-79c8d999ff-hs8z2\" (UID: \"178fd53d-c559-44c9-a809-5bbc9441448e\") " pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.634696 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.634679 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lzf8\" (UniqueName: \"kubernetes.io/projected/178fd53d-c559-44c9-a809-5bbc9441448e-kube-api-access-7lzf8\") pod \"cert-manager-79c8d999ff-hs8z2\" (UID: \"178fd53d-c559-44c9-a809-5bbc9441448e\") " pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.759266 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.759168 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-hs8z2"
Apr 22 18:48:57.882609 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:57.882184 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-hs8z2"]
Apr 22 18:48:58.557223 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:58.557184 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hs8z2" event={"ID":"178fd53d-c559-44c9-a809-5bbc9441448e","Type":"ContainerStarted","Data":"47ae0d34329f4a327eab3905187667cd288ed938a8428ff4cfd2a6eb3b1bdd2a"}
Apr 22 18:48:58.557223 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:58.557218 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-hs8z2" event={"ID":"178fd53d-c559-44c9-a809-5bbc9441448e","Type":"ContainerStarted","Data":"4f44a2ee7414783f84876675d4cc6206f29ba450a92f694f2a7702df4652a4d6"}
Apr 22 18:48:58.576086 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:48:58.576036 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-hs8z2" podStartSLOduration=1.576020473 podStartE2EDuration="1.576020473s" podCreationTimestamp="2026-04-22 18:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:58.574671283 +0000 UTC m=+322.712015697" watchObservedRunningTime="2026-04-22 18:48:58.576020473 +0000 UTC m=+322.713364886"
Apr 22 18:49:06.417085 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.417052 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64"]
Apr 22 18:49:06.440153 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.440113 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64"] Apr 22 
18:49:06.440312 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.440250 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.443207 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.443180 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:49:06.444164 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.444144 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-c5ljz\"" Apr 22 18:49:06.444164 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.444152 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:49:06.496681 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.496645 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.496681 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.496696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.496921 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.496753 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnvt6\" (UniqueName: \"kubernetes.io/projected/36a369a4-4f92-43e0-8ddc-41ace0f81912-kube-api-access-mnvt6\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.597310 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.597280 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.597310 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.597313 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.597530 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.597336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnvt6\" (UniqueName: \"kubernetes.io/projected/36a369a4-4f92-43e0-8ddc-41ace0f81912-kube-api-access-mnvt6\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.597734 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.597694 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.597780 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.597770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.614156 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.614122 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnvt6\" (UniqueName: \"kubernetes.io/projected/36a369a4-4f92-43e0-8ddc-41ace0f81912-kube-api-access-mnvt6\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.752247 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.752162 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:06.908230 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:06.908174 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64"] Apr 22 18:49:06.911088 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:49:06.911058 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a369a4_4f92_43e0_8ddc_41ace0f81912.slice/crio-4a4d251b67349da4ca07e188137b7e8aa8b403c7fca0922447338a0dc033530a WatchSource:0}: Error finding container 4a4d251b67349da4ca07e188137b7e8aa8b403c7fca0922447338a0dc033530a: Status 404 returned error can't find the container with id 4a4d251b67349da4ca07e188137b7e8aa8b403c7fca0922447338a0dc033530a Apr 22 18:49:07.588774 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:07.588737 2569 generic.go:358] "Generic (PLEG): container finished" podID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerID="c17e7f8f1cd5b7eb689ebe857adbb7236c779c201cb53f523ec8e12101624ff3" exitCode=0 Apr 22 18:49:07.589220 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:07.588811 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" event={"ID":"36a369a4-4f92-43e0-8ddc-41ace0f81912","Type":"ContainerDied","Data":"c17e7f8f1cd5b7eb689ebe857adbb7236c779c201cb53f523ec8e12101624ff3"} Apr 22 18:49:07.589220 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:07.588852 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" event={"ID":"36a369a4-4f92-43e0-8ddc-41ace0f81912","Type":"ContainerStarted","Data":"4a4d251b67349da4ca07e188137b7e8aa8b403c7fca0922447338a0dc033530a"} Apr 22 18:49:08.593689 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:49:08.593601 2569 generic.go:358] "Generic (PLEG): container finished" podID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerID="466431467e92b5f6f8e506d6fbdad77b640a50050a3467ad56ecade999bdf1f2" exitCode=0 Apr 22 18:49:08.594104 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:08.593737 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" event={"ID":"36a369a4-4f92-43e0-8ddc-41ace0f81912","Type":"ContainerDied","Data":"466431467e92b5f6f8e506d6fbdad77b640a50050a3467ad56ecade999bdf1f2"} Apr 22 18:49:09.599140 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:09.599108 2569 generic.go:358] "Generic (PLEG): container finished" podID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerID="f7f7e0b78fe6d7bd8e96bd7e5891ebfeb2d6f4120ca2b885ee1c0e50efb44667" exitCode=0 Apr 22 18:49:09.599540 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:09.599196 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" event={"ID":"36a369a4-4f92-43e0-8ddc-41ace0f81912","Type":"ContainerDied","Data":"f7f7e0b78fe6d7bd8e96bd7e5891ebfeb2d6f4120ca2b885ee1c0e50efb44667"} Apr 22 18:49:10.728875 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.728848 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:10.835815 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.835774 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-bundle\") pod \"36a369a4-4f92-43e0-8ddc-41ace0f81912\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " Apr 22 18:49:10.835815 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.835813 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnvt6\" (UniqueName: \"kubernetes.io/projected/36a369a4-4f92-43e0-8ddc-41ace0f81912-kube-api-access-mnvt6\") pod \"36a369a4-4f92-43e0-8ddc-41ace0f81912\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " Apr 22 18:49:10.836075 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.835839 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-util\") pod \"36a369a4-4f92-43e0-8ddc-41ace0f81912\" (UID: \"36a369a4-4f92-43e0-8ddc-41ace0f81912\") " Apr 22 18:49:10.836617 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.836590 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-bundle" (OuterVolumeSpecName: "bundle") pod "36a369a4-4f92-43e0-8ddc-41ace0f81912" (UID: "36a369a4-4f92-43e0-8ddc-41ace0f81912"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:10.838052 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.838029 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36a369a4-4f92-43e0-8ddc-41ace0f81912-kube-api-access-mnvt6" (OuterVolumeSpecName: "kube-api-access-mnvt6") pod "36a369a4-4f92-43e0-8ddc-41ace0f81912" (UID: "36a369a4-4f92-43e0-8ddc-41ace0f81912"). InnerVolumeSpecName "kube-api-access-mnvt6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:10.841747 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.841673 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-util" (OuterVolumeSpecName: "util") pod "36a369a4-4f92-43e0-8ddc-41ace0f81912" (UID: "36a369a4-4f92-43e0-8ddc-41ace0f81912"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:10.936845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.936761 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:49:10.936845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.936790 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mnvt6\" (UniqueName: \"kubernetes.io/projected/36a369a4-4f92-43e0-8ddc-41ace0f81912-kube-api-access-mnvt6\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:49:10.936845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:10.936800 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36a369a4-4f92-43e0-8ddc-41ace0f81912-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:49:11.608809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:11.608762 2569 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" event={"ID":"36a369a4-4f92-43e0-8ddc-41ace0f81912","Type":"ContainerDied","Data":"4a4d251b67349da4ca07e188137b7e8aa8b403c7fca0922447338a0dc033530a"} Apr 22 18:49:11.608809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:11.608815 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a4d251b67349da4ca07e188137b7e8aa8b403c7fca0922447338a0dc033530a" Apr 22 18:49:11.608809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:11.608773 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5ffx64" Apr 22 18:49:17.829951 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.829871 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv"] Apr 22 18:49:17.830385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.830175 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerName="util" Apr 22 18:49:17.830385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.830186 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerName="util" Apr 22 18:49:17.830385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.830203 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerName="pull" Apr 22 18:49:17.830385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.830209 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerName="pull" Apr 22 18:49:17.830385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.830220 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerName="extract" Apr 22 18:49:17.830385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.830227 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerName="extract" Apr 22 18:49:17.830385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.830278 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="36a369a4-4f92-43e0-8ddc-41ace0f81912" containerName="extract" Apr 22 18:49:17.833356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.833338 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.835891 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.835863 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:49:17.836824 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.836804 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:49:17.836824 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.836820 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-c5ljz\"" Apr 22 18:49:17.842994 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.842970 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv"] Apr 22 18:49:17.893802 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.893771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: 
\"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.893994 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.893833 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp8cd\" (UniqueName: \"kubernetes.io/projected/2665c9e8-5b6c-4941-a247-9acab76dc9f4-kube-api-access-pp8cd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.893994 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.893868 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.995172 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.995135 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.995360 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.995199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp8cd\" (UniqueName: \"kubernetes.io/projected/2665c9e8-5b6c-4941-a247-9acab76dc9f4-kube-api-access-pp8cd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: 
\"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.995360 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.995234 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.995619 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.995595 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:17.995657 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:17.995609 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:18.009797 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:18.009765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp8cd\" (UniqueName: \"kubernetes.io/projected/2665c9e8-5b6c-4941-a247-9acab76dc9f4-kube-api-access-pp8cd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:18.143936 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:18.143852 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:18.276110 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:18.276074 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv"] Apr 22 18:49:18.279691 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:49:18.279662 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2665c9e8_5b6c_4941_a247_9acab76dc9f4.slice/crio-630044c27f0c129a291e9620dd32949208686574b93c78ff9ea4509f29f39586 WatchSource:0}: Error finding container 630044c27f0c129a291e9620dd32949208686574b93c78ff9ea4509f29f39586: Status 404 returned error can't find the container with id 630044c27f0c129a291e9620dd32949208686574b93c78ff9ea4509f29f39586 Apr 22 18:49:18.633080 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:18.633044 2569 generic.go:358] "Generic (PLEG): container finished" podID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerID="e45f38c3f6bc7312f47e508c481b1de15968cc5fc690d0ea552a903b57fc2738" exitCode=0 Apr 22 18:49:18.633260 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:18.633094 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" event={"ID":"2665c9e8-5b6c-4941-a247-9acab76dc9f4","Type":"ContainerDied","Data":"e45f38c3f6bc7312f47e508c481b1de15968cc5fc690d0ea552a903b57fc2738"} Apr 22 18:49:18.633260 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:18.633128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" event={"ID":"2665c9e8-5b6c-4941-a247-9acab76dc9f4","Type":"ContainerStarted","Data":"630044c27f0c129a291e9620dd32949208686574b93c78ff9ea4509f29f39586"} Apr 22 18:49:19.549539 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.549499 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t"] Apr 22 18:49:19.552746 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.552698 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.556858 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.556832 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 18:49:19.557001 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.556866 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 18:49:19.557303 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.557287 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-7nxpj\"" Apr 22 18:49:19.557588 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.557571 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 18:49:19.557646 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.557603 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 18:49:19.579280 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.576889 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t"] Apr 22 18:49:19.607005 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.606968 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn52d\" (UniqueName: \"kubernetes.io/projected/67630851-6043-4d26-9507-020c13e93bc6-kube-api-access-rn52d\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.607005 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.607012 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67630851-6043-4d26-9507-020c13e93bc6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.607229 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.607049 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67630851-6043-4d26-9507-020c13e93bc6-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.637612 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.637578 2569 generic.go:358] "Generic (PLEG): container finished" podID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerID="5e01908ffc9d6f86926ccc2e5472a2e4d0fb232b7fe6aaf536aa1a4f34621c81" exitCode=0 Apr 22 18:49:19.637806 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.637674 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" event={"ID":"2665c9e8-5b6c-4941-a247-9acab76dc9f4","Type":"ContainerDied","Data":"5e01908ffc9d6f86926ccc2e5472a2e4d0fb232b7fe6aaf536aa1a4f34621c81"} Apr 22 18:49:19.708273 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.708240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67630851-6043-4d26-9507-020c13e93bc6-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.708424 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.708340 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn52d\" (UniqueName: \"kubernetes.io/projected/67630851-6043-4d26-9507-020c13e93bc6-kube-api-access-rn52d\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.708424 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.708371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67630851-6043-4d26-9507-020c13e93bc6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.711253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.711226 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/67630851-6043-4d26-9507-020c13e93bc6-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: 
\"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.711375 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.711279 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/67630851-6043-4d26-9507-020c13e93bc6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.719039 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.719013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn52d\" (UniqueName: \"kubernetes.io/projected/67630851-6043-4d26-9507-020c13e93bc6-kube-api-access-rn52d\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t\" (UID: \"67630851-6043-4d26-9507-020c13e93bc6\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:19.875890 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:19.875856 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:20.012461 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:20.012422 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t"] Apr 22 18:49:20.016527 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:49:20.016493 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67630851_6043_4d26_9507_020c13e93bc6.slice/crio-6cc95d4a32e74b3d5411ee523d00909d2a384760d24abb2f46d9d0e04f035d2d WatchSource:0}: Error finding container 6cc95d4a32e74b3d5411ee523d00909d2a384760d24abb2f46d9d0e04f035d2d: Status 404 returned error can't find the container with id 6cc95d4a32e74b3d5411ee523d00909d2a384760d24abb2f46d9d0e04f035d2d Apr 22 18:49:20.645055 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:20.644856 2569 generic.go:358] "Generic (PLEG): container finished" podID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerID="f91190e15f6d2ded19bd688df889389a684e38d11e8891d6d5f8a3534109f949" exitCode=0 Apr 22 18:49:20.645055 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:20.645008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" event={"ID":"2665c9e8-5b6c-4941-a247-9acab76dc9f4","Type":"ContainerDied","Data":"f91190e15f6d2ded19bd688df889389a684e38d11e8891d6d5f8a3534109f949"} Apr 22 18:49:20.647033 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:20.646987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" event={"ID":"67630851-6043-4d26-9507-020c13e93bc6","Type":"ContainerStarted","Data":"6cc95d4a32e74b3d5411ee523d00909d2a384760d24abb2f46d9d0e04f035d2d"} Apr 22 18:49:22.478186 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.478160 2569 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:22.533833 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.533806 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-bundle\") pod \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " Apr 22 18:49:22.533999 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.533880 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp8cd\" (UniqueName: \"kubernetes.io/projected/2665c9e8-5b6c-4941-a247-9acab76dc9f4-kube-api-access-pp8cd\") pod \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " Apr 22 18:49:22.533999 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.533960 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-util\") pod \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\" (UID: \"2665c9e8-5b6c-4941-a247-9acab76dc9f4\") " Apr 22 18:49:22.535336 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.535301 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-bundle" (OuterVolumeSpecName: "bundle") pod "2665c9e8-5b6c-4941-a247-9acab76dc9f4" (UID: "2665c9e8-5b6c-4941-a247-9acab76dc9f4"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:22.536520 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.536465 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2665c9e8-5b6c-4941-a247-9acab76dc9f4-kube-api-access-pp8cd" (OuterVolumeSpecName: "kube-api-access-pp8cd") pod "2665c9e8-5b6c-4941-a247-9acab76dc9f4" (UID: "2665c9e8-5b6c-4941-a247-9acab76dc9f4"). InnerVolumeSpecName "kube-api-access-pp8cd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:22.540966 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.540926 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-util" (OuterVolumeSpecName: "util") pod "2665c9e8-5b6c-4941-a247-9acab76dc9f4" (UID: "2665c9e8-5b6c-4941-a247-9acab76dc9f4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:22.635380 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.635344 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pp8cd\" (UniqueName: \"kubernetes.io/projected/2665c9e8-5b6c-4941-a247-9acab76dc9f4-kube-api-access-pp8cd\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:49:22.635380 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.635375 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:49:22.635380 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.635385 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2665c9e8-5b6c-4941-a247-9acab76dc9f4-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:49:22.658677 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.658649 2569 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" Apr 22 18:49:22.658883 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.658643 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9j8cpv" event={"ID":"2665c9e8-5b6c-4941-a247-9acab76dc9f4","Type":"ContainerDied","Data":"630044c27f0c129a291e9620dd32949208686574b93c78ff9ea4509f29f39586"} Apr 22 18:49:22.658883 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.658791 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="630044c27f0c129a291e9620dd32949208686574b93c78ff9ea4509f29f39586" Apr 22 18:49:22.660358 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.660329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" event={"ID":"67630851-6043-4d26-9507-020c13e93bc6","Type":"ContainerStarted","Data":"0b7a7fbd1d3f5f0afd7e4e31d1bf298cda646fdf847f5f2d3ebe491328f126aa"} Apr 22 18:49:22.660493 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.660446 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:22.682447 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:22.682390 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" podStartSLOduration=1.190348772 podStartE2EDuration="3.682376327s" podCreationTimestamp="2026-04-22 18:49:19 +0000 UTC" firstStartedPulling="2026-04-22 18:49:20.018574919 +0000 UTC m=+344.155919313" lastFinishedPulling="2026-04-22 18:49:22.510602463 +0000 UTC m=+346.647946868" observedRunningTime="2026-04-22 18:49:22.680581833 +0000 UTC m=+346.817926245" watchObservedRunningTime="2026-04-22 18:49:22.682376327 
+0000 UTC m=+346.819720740" Apr 22 18:49:25.282039 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.281999 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5"] Apr 22 18:49:25.282419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.282337 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerName="extract" Apr 22 18:49:25.282419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.282348 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerName="extract" Apr 22 18:49:25.282419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.282368 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerName="util" Apr 22 18:49:25.282419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.282374 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerName="util" Apr 22 18:49:25.282419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.282382 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerName="pull" Apr 22 18:49:25.282419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.282387 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerName="pull" Apr 22 18:49:25.282599 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.282435 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="2665c9e8-5b6c-4941-a247-9acab76dc9f4" containerName="extract" Apr 22 18:49:25.284947 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.284929 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.289168 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.289132 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:49:25.289347 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.289174 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:49:25.289680 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.289658 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-4w9pb\"" Apr 22 18:49:25.289809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.289747 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:49:25.289969 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.289952 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:49:25.290360 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.290342 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:49:25.295593 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.295567 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5"] Apr 22 18:49:25.361534 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.361494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bb3fe25a-e9a2-48dd-b291-777a670e942d-manager-config\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " 
pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.361534 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.361531 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp8qc\" (UniqueName: \"kubernetes.io/projected/bb3fe25a-e9a2-48dd-b291-777a670e942d-kube-api-access-dp8qc\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.361812 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.361616 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb3fe25a-e9a2-48dd-b291-777a670e942d-cert\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.361812 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.361740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb3fe25a-e9a2-48dd-b291-777a670e942d-metrics-cert\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.462538 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.462500 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb3fe25a-e9a2-48dd-b291-777a670e942d-metrics-cert\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.462750 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.462557 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bb3fe25a-e9a2-48dd-b291-777a670e942d-manager-config\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.462750 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.462583 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dp8qc\" (UniqueName: \"kubernetes.io/projected/bb3fe25a-e9a2-48dd-b291-777a670e942d-kube-api-access-dp8qc\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.462750 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.462628 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb3fe25a-e9a2-48dd-b291-777a670e942d-cert\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.463303 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.463275 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/bb3fe25a-e9a2-48dd-b291-777a670e942d-manager-config\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.465176 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.465157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bb3fe25a-e9a2-48dd-b291-777a670e942d-cert\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " 
pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.465322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.465291 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/bb3fe25a-e9a2-48dd-b291-777a670e942d-metrics-cert\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.472996 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.472972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp8qc\" (UniqueName: \"kubernetes.io/projected/bb3fe25a-e9a2-48dd-b291-777a670e942d-kube-api-access-dp8qc\") pod \"lws-controller-manager-59544c8f7-bbmw5\" (UID: \"bb3fe25a-e9a2-48dd-b291-777a670e942d\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.596021 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.595982 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:25.741037 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:25.740982 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5"] Apr 22 18:49:25.743441 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:49:25.743403 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb3fe25a_e9a2_48dd_b291_777a670e942d.slice/crio-ceed125b34f4c11a78391f8c4918033c951ba5a0669ffcfe7b9b18efb076cec9 WatchSource:0}: Error finding container ceed125b34f4c11a78391f8c4918033c951ba5a0669ffcfe7b9b18efb076cec9: Status 404 returned error can't find the container with id ceed125b34f4c11a78391f8c4918033c951ba5a0669ffcfe7b9b18efb076cec9 Apr 22 18:49:26.675872 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:26.675835 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" event={"ID":"bb3fe25a-e9a2-48dd-b291-777a670e942d","Type":"ContainerStarted","Data":"ceed125b34f4c11a78391f8c4918033c951ba5a0669ffcfe7b9b18efb076cec9"} Apr 22 18:49:27.681509 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:27.681407 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" event={"ID":"bb3fe25a-e9a2-48dd-b291-777a670e942d","Type":"ContainerStarted","Data":"73b3f2fafbc1fee06e41ce8215074f6aabd556aa5572dc169fec8bb7c200c510"} Apr 22 18:49:27.681995 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:27.681512 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:27.699739 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:27.699668 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" podStartSLOduration=1.084538207 podStartE2EDuration="2.699653457s" podCreationTimestamp="2026-04-22 18:49:25 +0000 UTC" firstStartedPulling="2026-04-22 18:49:25.745306597 +0000 UTC m=+349.882650988" lastFinishedPulling="2026-04-22 18:49:27.360421844 +0000 UTC m=+351.497766238" observedRunningTime="2026-04-22 18:49:27.697078126 +0000 UTC m=+351.834422575" watchObservedRunningTime="2026-04-22 18:49:27.699653457 +0000 UTC m=+351.836997870" Apr 22 18:49:33.667194 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:33.667162 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t" Apr 22 18:49:36.671509 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.671423 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j"] Apr 22 18:49:36.676113 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.676091 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.679490 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.679466 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-c5ljz\"" Apr 22 18:49:36.679490 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.679481 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:49:36.680191 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.680171 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:49:36.691334 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.691309 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j"] Apr 22 18:49:36.764756 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.764690 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8gdj\" (UniqueName: \"kubernetes.io/projected/80c40861-3f00-41f3-b9aa-895c3df8095a-kube-api-access-k8gdj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.764926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.764776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.764926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.764823 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.865967 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.865927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gdj\" (UniqueName: \"kubernetes.io/projected/80c40861-3f00-41f3-b9aa-895c3df8095a-kube-api-access-k8gdj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.865967 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.865970 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.866216 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.865989 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.866405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.866385 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.866405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.866399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.874422 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.874394 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8gdj\" (UniqueName: \"kubernetes.io/projected/80c40861-3f00-41f3-b9aa-895c3df8095a-kube-api-access-k8gdj\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:36.984978 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:36.984878 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" Apr 22 18:49:37.129960 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:37.129903 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j"] Apr 22 18:49:37.132068 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:49:37.132039 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c40861_3f00_41f3_b9aa_895c3df8095a.slice/crio-76234f8c80acb1ec0c122076dc6e577be780090871c74ba7cd7d06733dfdc177 WatchSource:0}: Error finding container 76234f8c80acb1ec0c122076dc6e577be780090871c74ba7cd7d06733dfdc177: Status 404 returned error can't find the container with id 76234f8c80acb1ec0c122076dc6e577be780090871c74ba7cd7d06733dfdc177 Apr 22 18:49:37.719521 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:37.719489 2569 generic.go:358] "Generic (PLEG): container finished" podID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerID="62f7c9962c37d963212eac0ee962a68850ad9a937d03b50725dcf45e345de82e" exitCode=0 Apr 22 18:49:37.719935 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:37.719549 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" event={"ID":"80c40861-3f00-41f3-b9aa-895c3df8095a","Type":"ContainerDied","Data":"62f7c9962c37d963212eac0ee962a68850ad9a937d03b50725dcf45e345de82e"} Apr 22 18:49:37.719935 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:37.719569 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" event={"ID":"80c40861-3f00-41f3-b9aa-895c3df8095a","Type":"ContainerStarted","Data":"76234f8c80acb1ec0c122076dc6e577be780090871c74ba7cd7d06733dfdc177"} Apr 22 18:49:38.686859 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:49:38.686827 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-bbmw5" Apr 22 18:49:38.725068 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:38.724962 2569 generic.go:358] "Generic (PLEG): container finished" podID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerID="42b8d54a7bba91af88ccf620a9700db09e3da6ac39475c0ec61c294428ec536a" exitCode=0 Apr 22 18:49:38.725453 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:38.725057 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" event={"ID":"80c40861-3f00-41f3-b9aa-895c3df8095a","Type":"ContainerDied","Data":"42b8d54a7bba91af88ccf620a9700db09e3da6ac39475c0ec61c294428ec536a"} Apr 22 18:49:39.730823 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:39.730776 2569 generic.go:358] "Generic (PLEG): container finished" podID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerID="e3be681a2d66d1145b1faee685760a0909b4eb149cf2bff85662520ca1cb3fbf" exitCode=0 Apr 22 18:49:39.731244 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:39.730905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" event={"ID":"80c40861-3f00-41f3-b9aa-895c3df8095a","Type":"ContainerDied","Data":"e3be681a2d66d1145b1faee685760a0909b4eb149cf2bff85662520ca1cb3fbf"} Apr 22 18:49:40.877581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:40.877556 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j"
Apr 22 18:49:41.003895 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.003800 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-bundle\") pod \"80c40861-3f00-41f3-b9aa-895c3df8095a\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") "
Apr 22 18:49:41.003895 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.003881 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8gdj\" (UniqueName: \"kubernetes.io/projected/80c40861-3f00-41f3-b9aa-895c3df8095a-kube-api-access-k8gdj\") pod \"80c40861-3f00-41f3-b9aa-895c3df8095a\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") "
Apr 22 18:49:41.004078 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.003906 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-util\") pod \"80c40861-3f00-41f3-b9aa-895c3df8095a\" (UID: \"80c40861-3f00-41f3-b9aa-895c3df8095a\") "
Apr 22 18:49:41.004756 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.004698 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-bundle" (OuterVolumeSpecName: "bundle") pod "80c40861-3f00-41f3-b9aa-895c3df8095a" (UID: "80c40861-3f00-41f3-b9aa-895c3df8095a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:49:41.005993 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.005968 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80c40861-3f00-41f3-b9aa-895c3df8095a-kube-api-access-k8gdj" (OuterVolumeSpecName: "kube-api-access-k8gdj") pod "80c40861-3f00-41f3-b9aa-895c3df8095a" (UID: "80c40861-3f00-41f3-b9aa-895c3df8095a"). InnerVolumeSpecName "kube-api-access-k8gdj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:49:41.011763 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.011699 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-util" (OuterVolumeSpecName: "util") pod "80c40861-3f00-41f3-b9aa-895c3df8095a" (UID: "80c40861-3f00-41f3-b9aa-895c3df8095a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:49:41.105452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.105419 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:49:41.105452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.105449 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8gdj\" (UniqueName: \"kubernetes.io/projected/80c40861-3f00-41f3-b9aa-895c3df8095a-kube-api-access-k8gdj\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:49:41.105452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.105459 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80c40861-3f00-41f3-b9aa-895c3df8095a-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:49:41.740296 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.740255 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j" event={"ID":"80c40861-3f00-41f3-b9aa-895c3df8095a","Type":"ContainerDied","Data":"76234f8c80acb1ec0c122076dc6e577be780090871c74ba7cd7d06733dfdc177"}
Apr 22 18:49:41.740296 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.740293 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76234f8c80acb1ec0c122076dc6e577be780090871c74ba7cd7d06733dfdc177"
Apr 22 18:49:41.740296 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:41.740300 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48359vg7j"
Apr 22 18:49:51.456085 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456046 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"]
Apr 22 18:49:51.456458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456387 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerName="pull"
Apr 22 18:49:51.456458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456400 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerName="pull"
Apr 22 18:49:51.456458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456412 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerName="extract"
Apr 22 18:49:51.456458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456421 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerName="extract"
Apr 22 18:49:51.456458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456432 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerName="util"
Apr 22 18:49:51.456458 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456437 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerName="util"
Apr 22 18:49:51.456640 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.456485 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="80c40861-3f00-41f3-b9aa-895c3df8095a" containerName="extract"
Apr 22 18:49:51.459834 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.459807 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.469981 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.469950 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 18:49:51.470313 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.470294 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-c5ljz\""
Apr 22 18:49:51.470858 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.470843 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 18:49:51.477762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.477737 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"]
Apr 22 18:49:51.498482 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.498442 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.498847 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.498817 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfx7w\" (UniqueName: \"kubernetes.io/projected/5d7cd3f3-2793-4ad5-968d-7453afe41c12-kube-api-access-gfx7w\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.499024 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.499005 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.599569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.599535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.599780 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.599595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfx7w\" (UniqueName: \"kubernetes.io/projected/5d7cd3f3-2793-4ad5-968d-7453afe41c12-kube-api-access-gfx7w\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.599780 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.599624 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.599927 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.599909 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.599966 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.599956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.614596 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.614564 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfx7w\" (UniqueName: \"kubernetes.io/projected/5d7cd3f3-2793-4ad5-968d-7453afe41c12-kube-api-access-gfx7w\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.768962 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.768870 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:49:51.943773 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:51.943740 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"]
Apr 22 18:49:52.784937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:52.782426 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerID="efca3ab00b90d1d8809134636a1c49abda5ec722495f181344bf4d84cf46db36" exitCode=0
Apr 22 18:49:52.784937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:52.782478 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds" event={"ID":"5d7cd3f3-2793-4ad5-968d-7453afe41c12","Type":"ContainerDied","Data":"efca3ab00b90d1d8809134636a1c49abda5ec722495f181344bf4d84cf46db36"}
Apr 22 18:49:52.784937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:49:52.782524 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds" event={"ID":"5d7cd3f3-2793-4ad5-968d-7453afe41c12","Type":"ContainerStarted","Data":"39c082fa09977f0e0bba231dbbecb33c1a9669be6142b6681418715b16f60913"}
Apr 22 18:50:03.820813 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:03.820778 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerID="e23b61a557dc213ef7cbc58dcd040883784eee688fdd8ef3944129c259c27ef2" exitCode=0
Apr 22 18:50:03.821324 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:03.820859 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds" event={"ID":"5d7cd3f3-2793-4ad5-968d-7453afe41c12","Type":"ContainerDied","Data":"e23b61a557dc213ef7cbc58dcd040883784eee688fdd8ef3944129c259c27ef2"}
Apr 22 18:50:04.832675 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:04.832640 2569 generic.go:358] "Generic (PLEG): container finished" podID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerID="4ffddcdf821d510a9399a5dc4c9f35f0517937697395ed1945bf5360512cff2d" exitCode=0
Apr 22 18:50:04.833160 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:04.832735 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds" event={"ID":"5d7cd3f3-2793-4ad5-968d-7453afe41c12","Type":"ContainerDied","Data":"4ffddcdf821d510a9399a5dc4c9f35f0517937697395ed1945bf5360512cff2d"}
Apr 22 18:50:05.960933 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:05.960908 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:50:06.031599 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.031565 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-bundle\") pod \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") "
Apr 22 18:50:06.031813 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.031629 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-util\") pod \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") "
Apr 22 18:50:06.031813 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.031667 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfx7w\" (UniqueName: \"kubernetes.io/projected/5d7cd3f3-2793-4ad5-968d-7453afe41c12-kube-api-access-gfx7w\") pod \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\" (UID: \"5d7cd3f3-2793-4ad5-968d-7453afe41c12\") "
Apr 22 18:50:06.032609 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.032579 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-bundle" (OuterVolumeSpecName: "bundle") pod "5d7cd3f3-2793-4ad5-968d-7453afe41c12" (UID: "5d7cd3f3-2793-4ad5-968d-7453afe41c12"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:06.033831 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.033799 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7cd3f3-2793-4ad5-968d-7453afe41c12-kube-api-access-gfx7w" (OuterVolumeSpecName: "kube-api-access-gfx7w") pod "5d7cd3f3-2793-4ad5-968d-7453afe41c12" (UID: "5d7cd3f3-2793-4ad5-968d-7453afe41c12"). InnerVolumeSpecName "kube-api-access-gfx7w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:50:06.036402 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.036377 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-util" (OuterVolumeSpecName: "util") pod "5d7cd3f3-2793-4ad5-968d-7453afe41c12" (UID: "5d7cd3f3-2793-4ad5-968d-7453afe41c12"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:50:06.132466 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.132361 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:50:06.132466 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.132408 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5d7cd3f3-2793-4ad5-968d-7453afe41c12-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:50:06.132466 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.132419 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gfx7w\" (UniqueName: \"kubernetes.io/projected/5d7cd3f3-2793-4ad5-968d-7453afe41c12-kube-api-access-gfx7w\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:50:06.842493 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.842460 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds" event={"ID":"5d7cd3f3-2793-4ad5-968d-7453afe41c12","Type":"ContainerDied","Data":"39c082fa09977f0e0bba231dbbecb33c1a9669be6142b6681418715b16f60913"}
Apr 22 18:50:06.842493 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.842493 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39c082fa09977f0e0bba231dbbecb33c1a9669be6142b6681418715b16f60913"
Apr 22 18:50:06.842767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:06.842514 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebvv4ds"
Apr 22 18:50:38.327739 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.327676 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nx5cw"]
Apr 22 18:50:38.328215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.328065 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerName="extract"
Apr 22 18:50:38.328215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.328077 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerName="extract"
Apr 22 18:50:38.328215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.328096 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerName="util"
Apr 22 18:50:38.328215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.328102 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerName="util"
Apr 22 18:50:38.328215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.328109 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerName="pull"
Apr 22 18:50:38.328215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.328115 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerName="pull"
Apr 22 18:50:38.328215 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.328169 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d7cd3f3-2793-4ad5-968d-7453afe41c12" containerName="extract"
Apr 22 18:50:38.332294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.332275 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw"
Apr 22 18:50:38.335618 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.335591 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-bltkv\""
Apr 22 18:50:38.336653 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.336631 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 18:50:38.336811 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.336669 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 18:50:38.340823 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.340799 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nx5cw"]
Apr 22 18:50:38.409803 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.409767 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khx9r\" (UniqueName: \"kubernetes.io/projected/528d1f89-18bc-4e58-a2e9-6ccc58896e18-kube-api-access-khx9r\") pod \"kuadrant-operator-catalog-nx5cw\" (UID: \"528d1f89-18bc-4e58-a2e9-6ccc58896e18\") " pod="kuadrant-system/kuadrant-operator-catalog-nx5cw"
Apr 22 18:50:38.511005 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.510973 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khx9r\" (UniqueName: \"kubernetes.io/projected/528d1f89-18bc-4e58-a2e9-6ccc58896e18-kube-api-access-khx9r\") pod \"kuadrant-operator-catalog-nx5cw\" (UID: \"528d1f89-18bc-4e58-a2e9-6ccc58896e18\") " pod="kuadrant-system/kuadrant-operator-catalog-nx5cw"
Apr 22 18:50:38.525610 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.525539 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khx9r\" (UniqueName: \"kubernetes.io/projected/528d1f89-18bc-4e58-a2e9-6ccc58896e18-kube-api-access-khx9r\") pod \"kuadrant-operator-catalog-nx5cw\" (UID: \"528d1f89-18bc-4e58-a2e9-6ccc58896e18\") " pod="kuadrant-system/kuadrant-operator-catalog-nx5cw"
Apr 22 18:50:38.643234 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.643142 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw"
Apr 22 18:50:38.786840 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.786809 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nx5cw"]
Apr 22 18:50:38.788485 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:50:38.788455 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod528d1f89_18bc_4e58_a2e9_6ccc58896e18.slice/crio-c1cdd8e7ea20caf57ef2b71c11851595caf72f287f0e1520887054c91628541b WatchSource:0}: Error finding container c1cdd8e7ea20caf57ef2b71c11851595caf72f287f0e1520887054c91628541b: Status 404 returned error can't find the container with id c1cdd8e7ea20caf57ef2b71c11851595caf72f287f0e1520887054c91628541b
Apr 22 18:50:38.957042 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:38.956953 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-nx5cw" event={"ID":"528d1f89-18bc-4e58-a2e9-6ccc58896e18","Type":"ContainerStarted","Data":"c1cdd8e7ea20caf57ef2b71c11851595caf72f287f0e1520887054c91628541b"}
Apr 22 18:50:40.966603 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:40.966565 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw" event={"ID":"528d1f89-18bc-4e58-a2e9-6ccc58896e18","Type":"ContainerStarted","Data":"ff7cc28a13f8ef8a09789480c0e0645c396547fb7cdd6fdd1be375f42a426b97"}
Apr 22 18:50:40.986937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:40.986881 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw" podStartSLOduration=0.928993693 podStartE2EDuration="2.986861237s" podCreationTimestamp="2026-04-22 18:50:38 +0000 UTC" firstStartedPulling="2026-04-22 18:50:38.789807522 +0000 UTC m=+422.927151912" lastFinishedPulling="2026-04-22 18:50:40.847675055 +0000 UTC m=+424.985019456" observedRunningTime="2026-04-22 18:50:40.984478959 +0000 UTC m=+425.121823371" watchObservedRunningTime="2026-04-22 18:50:40.986861237 +0000 UTC m=+425.124205649"
Apr 22 18:50:43.329405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.329364 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"]
Apr 22 18:50:43.333162 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.333135 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.336255 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.336226 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:50:43.336408 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.336261 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:50:43.336408 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.336283 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-9x5hq\""
Apr 22 18:50:43.337008 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.336991 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 18:50:43.344234 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.344207 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"]
Apr 22 18:50:43.348976 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.348942 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.348986 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.349010 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.349123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349175 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.349166 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.349234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" 
(UniqueName: \"kubernetes.io/projected/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.349262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e111bc77-1baa-4309-845c-d1ee770f7b9d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349472 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.349372 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.349472 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.349411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mbg8\" (UniqueName: \"kubernetes.io/projected/e111bc77-1baa-4309-845c-d1ee770f7b9d-kube-api-access-8mbg8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450328 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450328 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mbg8\" (UniqueName: \"kubernetes.io/projected/e111bc77-1baa-4309-845c-d1ee770f7b9d-kube-api-access-8mbg8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450406 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450431 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450500 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450581 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450536 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450904 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450742 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/e111bc77-1baa-4309-845c-d1ee770f7b9d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450904 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.450904 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450893 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.451062 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.450991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.451236 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.451210 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.451476 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.451454 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e111bc77-1baa-4309-845c-d1ee770f7b9d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.452759 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.452736 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.453026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.453006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.463044 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.463009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e111bc77-1baa-4309-845c-d1ee770f7b9d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.463180 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.463092 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mbg8\" (UniqueName: \"kubernetes.io/projected/e111bc77-1baa-4309-845c-d1ee770f7b9d-kube-api-access-8mbg8\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8\" (UID: \"e111bc77-1baa-4309-845c-d1ee770f7b9d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.645131 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.645023 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"
Apr 22 18:50:43.783859 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.783826 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8"]
Apr 22 18:50:43.785456 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:50:43.785411 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode111bc77_1baa_4309_845c_d1ee770f7b9d.slice/crio-839d8c78152c3400aca0e3d3edb13fd299d3c0223a16a959470960f73846e880 WatchSource:0}: Error finding container 839d8c78152c3400aca0e3d3edb13fd299d3c0223a16a959470960f73846e880: Status 404 returned error can't find the container with id 839d8c78152c3400aca0e3d3edb13fd299d3c0223a16a959470960f73846e880
Apr 22 18:50:43.979756 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:43.979644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8" 
event={"ID":"e111bc77-1baa-4309-845c-d1ee770f7b9d","Type":"ContainerStarted","Data":"839d8c78152c3400aca0e3d3edb13fd299d3c0223a16a959470960f73846e880"} Apr 22 18:50:48.643857 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:48.643745 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw" Apr 22 18:50:48.644308 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:48.643925 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw" Apr 22 18:50:48.669040 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:48.669007 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw" Apr 22 18:50:49.030862 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:49.030778 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-nx5cw" Apr 22 18:50:49.312032 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:49.311991 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:50:49.312150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:49.312068 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:50:49.312150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:49.312094 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:50:50.003555 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:50.003520 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8" event={"ID":"e111bc77-1baa-4309-845c-d1ee770f7b9d","Type":"ContainerStarted","Data":"4d2212daeaa85972897da58e5bb5c6449e721f132d1eed6fd52f9c4ad80abd27"} Apr 22 18:50:50.028192 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:50.028138 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8" podStartSLOduration=1.5039825759999998 podStartE2EDuration="7.028123644s" podCreationTimestamp="2026-04-22 18:50:43 +0000 UTC" firstStartedPulling="2026-04-22 18:50:43.787579957 +0000 UTC m=+427.924924353" lastFinishedPulling="2026-04-22 18:50:49.311721014 +0000 UTC m=+433.449065421" observedRunningTime="2026-04-22 18:50:50.026403012 +0000 UTC m=+434.163747437" watchObservedRunningTime="2026-04-22 18:50:50.028123644 +0000 UTC m=+434.165468056" Apr 22 18:50:50.645762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:50.645697 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8" Apr 22 18:50:50.650519 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:50.650489 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8" Apr 22 18:50:51.007244 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:51.007145 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8" Apr 22 18:50:51.008204 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:50:51.008187 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8" Apr 22 18:51:07.254260 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.254217 2569 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd"] Apr 22 18:51:07.262575 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.262546 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.265383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.265353 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-wnkm8\"" Apr 22 18:51:07.266806 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.266774 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd"] Apr 22 18:51:07.351302 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.351266 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v"] Apr 22 18:51:07.355248 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.355226 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.364099 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.364069 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v"] Apr 22 18:51:07.371801 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.371773 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.371972 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.371920 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5qk2\" (UniqueName: \"kubernetes.io/projected/a6693871-180e-4097-983c-c6bb8688304b-kube-api-access-j5qk2\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.372026 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.371984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.472916 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.472859 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-j5qk2\" (UniqueName: \"kubernetes.io/projected/a6693871-180e-4097-983c-c6bb8688304b-kube-api-access-j5qk2\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.472916 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.472918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.473193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.472979 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.473193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.473016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b87pd\" (UniqueName: \"kubernetes.io/projected/218a6a25-5455-467f-8b36-2e3a9004d2f9-kube-api-access-b87pd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.473193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.473046 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.473193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.473174 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.473357 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.473339 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.473392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.473369 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.482688 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.482657 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5qk2\" (UniqueName: \"kubernetes.io/projected/a6693871-180e-4097-983c-c6bb8688304b-kube-api-access-j5qk2\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.573799 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.573757 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" Apr 22 18:51:07.574012 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.573986 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.574076 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.574030 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b87pd\" (UniqueName: \"kubernetes.io/projected/218a6a25-5455-467f-8b36-2e3a9004d2f9-kube-api-access-b87pd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.574117 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.574090 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.574862 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.574424 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.574862 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.574435 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.582497 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.582470 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b87pd\" (UniqueName: \"kubernetes.io/projected/218a6a25-5455-467f-8b36-2e3a9004d2f9-kube-api-access-b87pd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.652949 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.652915 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz"] Apr 22 18:51:07.658627 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.658600 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.666194 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.665684 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" Apr 22 18:51:07.667144 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.666986 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz"] Apr 22 18:51:07.721029 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.720653 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd"] Apr 22 18:51:07.723763 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:51:07.723703 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6693871_180e_4097_983c_c6bb8688304b.slice/crio-36f0182f3c3d7c6f0c9b98e06b31c4ee07068252cb545a20a98276af69c30319 WatchSource:0}: Error finding container 36f0182f3c3d7c6f0c9b98e06b31c4ee07068252cb545a20a98276af69c30319: Status 404 returned error can't find the container with id 36f0182f3c3d7c6f0c9b98e06b31c4ee07068252cb545a20a98276af69c30319 Apr 22 18:51:07.777280 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.777207 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.777437 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.777357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") 
" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.777437 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.777431 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7gcr\" (UniqueName: \"kubernetes.io/projected/32c07195-42f6-418b-bcdd-48517730c21b-kube-api-access-f7gcr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.812952 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.812922 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v"] Apr 22 18:51:07.814795 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:51:07.814761 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218a6a25_5455_467f_8b36_2e3a9004d2f9.slice/crio-742074a54d18c504b38622baecb121e05683cc0595dc37adebf18d2938012b1a WatchSource:0}: Error finding container 742074a54d18c504b38622baecb121e05683cc0595dc37adebf18d2938012b1a: Status 404 returned error can't find the container with id 742074a54d18c504b38622baecb121e05683cc0595dc37adebf18d2938012b1a Apr 22 18:51:07.877883 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.877838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.878060 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.877904 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-f7gcr\" (UniqueName: \"kubernetes.io/projected/32c07195-42f6-418b-bcdd-48517730c21b-kube-api-access-f7gcr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.878060 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.877941 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.878253 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.878231 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.878293 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.878255 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.887385 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.887358 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7gcr\" (UniqueName: 
\"kubernetes.io/projected/32c07195-42f6-418b-bcdd-48517730c21b-kube-api-access-f7gcr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:07.972483 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:07.972448 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" Apr 22 18:51:08.047106 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.047070 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"] Apr 22 18:51:08.051306 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.051286 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" Apr 22 18:51:08.061383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.061345 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"] Apr 22 18:51:08.072788 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.072755 2569 generic.go:358] "Generic (PLEG): container finished" podID="a6693871-180e-4097-983c-c6bb8688304b" containerID="5d59b151a80533b05dc3a82924ac6239ca755fe0427fd60e844b06c275858236" exitCode=0 Apr 22 18:51:08.072985 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.072838 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" event={"ID":"a6693871-180e-4097-983c-c6bb8688304b","Type":"ContainerDied","Data":"5d59b151a80533b05dc3a82924ac6239ca755fe0427fd60e844b06c275858236"} Apr 22 18:51:08.072985 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.072884 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" event={"ID":"a6693871-180e-4097-983c-c6bb8688304b","Type":"ContainerStarted","Data":"36f0182f3c3d7c6f0c9b98e06b31c4ee07068252cb545a20a98276af69c30319"} Apr 22 18:51:08.075644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.075479 2569 generic.go:358] "Generic (PLEG): container finished" podID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerID="77a48a90b11ab67c00c3333d47cd6f5d0e8f6af0736f7fb1e3324683fe252479" exitCode=0 Apr 22 18:51:08.075644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.075527 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" event={"ID":"218a6a25-5455-467f-8b36-2e3a9004d2f9","Type":"ContainerDied","Data":"77a48a90b11ab67c00c3333d47cd6f5d0e8f6af0736f7fb1e3324683fe252479"} Apr 22 18:51:08.075644 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.075560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" event={"ID":"218a6a25-5455-467f-8b36-2e3a9004d2f9","Type":"ContainerStarted","Data":"742074a54d18c504b38622baecb121e05683cc0595dc37adebf18d2938012b1a"} Apr 22 18:51:08.115626 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.115593 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz"] Apr 22 18:51:08.179793 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.179762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcpx\" (UniqueName: \"kubernetes.io/projected/8ecfaa96-6ca2-4448-8418-f8c06647d59a-kube-api-access-rxcpx\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" Apr 22 18:51:08.179926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.179820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" Apr 22 18:51:08.179926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.179850 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" Apr 22 18:51:08.280977 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.280930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcpx\" (UniqueName: \"kubernetes.io/projected/8ecfaa96-6ca2-4448-8418-f8c06647d59a-kube-api-access-rxcpx\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" Apr 22 18:51:08.281382 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.280987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:08.281382 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.281013 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:08.281455 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.281415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:08.281455 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.281439 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:08.290331 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.290303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcpx\" (UniqueName: \"kubernetes.io/projected/8ecfaa96-6ca2-4448-8418-f8c06647d59a-kube-api-access-rxcpx\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:08.366091 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.365991 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:08.518132 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:08.517996 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"]
Apr 22 18:51:08.521161 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:51:08.521134 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ecfaa96_6ca2_4448_8418_f8c06647d59a.slice/crio-a7733d6a56bdf9bf85c8d321a2d89ab3318c0ef284392511d8a81593bba5b23b WatchSource:0}: Error finding container a7733d6a56bdf9bf85c8d321a2d89ab3318c0ef284392511d8a81593bba5b23b: Status 404 returned error can't find the container with id a7733d6a56bdf9bf85c8d321a2d89ab3318c0ef284392511d8a81593bba5b23b
Apr 22 18:51:09.080993 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.080956 2569 generic.go:358] "Generic (PLEG): container finished" podID="a6693871-180e-4097-983c-c6bb8688304b" containerID="dc7ea55bb72ca10581a3eea2b4987ab68c5e0b2c20a7c11866c97909c7e7432a" exitCode=0
Apr 22 18:51:09.081201 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.081043 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" event={"ID":"a6693871-180e-4097-983c-c6bb8688304b","Type":"ContainerDied","Data":"dc7ea55bb72ca10581a3eea2b4987ab68c5e0b2c20a7c11866c97909c7e7432a"}
Apr 22 18:51:09.082666 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.082636 2569 generic.go:358] "Generic (PLEG): container finished" podID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerID="853b8934929fdd21a6680c170fc4643d58d8c08f45fdd0fbb7d9ab1d48cd77d5" exitCode=0
Apr 22 18:51:09.082886 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.082729 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" event={"ID":"218a6a25-5455-467f-8b36-2e3a9004d2f9","Type":"ContainerDied","Data":"853b8934929fdd21a6680c170fc4643d58d8c08f45fdd0fbb7d9ab1d48cd77d5"}
Apr 22 18:51:09.084091 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.084067 2569 generic.go:358] "Generic (PLEG): container finished" podID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerID="e0ba991c046907d16a6b7c2209759d6e8a5ff20717f6a12750d3147b8fcfed92" exitCode=0
Apr 22 18:51:09.084197 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.084138 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" event={"ID":"8ecfaa96-6ca2-4448-8418-f8c06647d59a","Type":"ContainerDied","Data":"e0ba991c046907d16a6b7c2209759d6e8a5ff20717f6a12750d3147b8fcfed92"}
Apr 22 18:51:09.084197 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.084162 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" event={"ID":"8ecfaa96-6ca2-4448-8418-f8c06647d59a","Type":"ContainerStarted","Data":"a7733d6a56bdf9bf85c8d321a2d89ab3318c0ef284392511d8a81593bba5b23b"}
Apr 22 18:51:09.085737 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.085697 2569 generic.go:358] "Generic (PLEG): container finished" podID="32c07195-42f6-418b-bcdd-48517730c21b" containerID="7771064c32eba54d279e7da2b12c138524a9a87993c701c63de27283f62909f2" exitCode=0
Apr 22 18:51:09.085836 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.085743 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" event={"ID":"32c07195-42f6-418b-bcdd-48517730c21b","Type":"ContainerDied","Data":"7771064c32eba54d279e7da2b12c138524a9a87993c701c63de27283f62909f2"}
Apr 22 18:51:09.085836 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:09.085776 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" event={"ID":"32c07195-42f6-418b-bcdd-48517730c21b","Type":"ContainerStarted","Data":"bdabb5cfffecf66a5d07f34c16448e9eff853770118ff5a2adbdc4bcf27dfe8f"}
Apr 22 18:51:10.092696 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:10.092663 2569 generic.go:358] "Generic (PLEG): container finished" podID="a6693871-180e-4097-983c-c6bb8688304b" containerID="8d5f5d6b83414b7a7a9e4e61ad9d71e53883f9687f9caaeef83b0fa5bc1eb6ff" exitCode=0
Apr 22 18:51:10.093180 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:10.092747 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" event={"ID":"a6693871-180e-4097-983c-c6bb8688304b","Type":"ContainerDied","Data":"8d5f5d6b83414b7a7a9e4e61ad9d71e53883f9687f9caaeef83b0fa5bc1eb6ff"}
Apr 22 18:51:10.094996 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:10.094957 2569 generic.go:358] "Generic (PLEG): container finished" podID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerID="9d29c498f7345e7d8738f3b4c42c1564258576d0204c6451bbf1232320fa78fc" exitCode=0
Apr 22 18:51:10.095169 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:10.095044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" event={"ID":"218a6a25-5455-467f-8b36-2e3a9004d2f9","Type":"ContainerDied","Data":"9d29c498f7345e7d8738f3b4c42c1564258576d0204c6451bbf1232320fa78fc"}
Apr 22 18:51:10.096972 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:10.096940 2569 generic.go:358] "Generic (PLEG): container finished" podID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerID="799945c33f9de32413682351147ad4a330285350239b091c9934bb2b157d8278" exitCode=0
Apr 22 18:51:10.097116 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:10.096991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" event={"ID":"8ecfaa96-6ca2-4448-8418-f8c06647d59a","Type":"ContainerDied","Data":"799945c33f9de32413682351147ad4a330285350239b091c9934bb2b157d8278"}
Apr 22 18:51:11.102957 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.102919 2569 generic.go:358] "Generic (PLEG): container finished" podID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerID="b74d5256e08dd94b02e433ce2e773a616d6f66d77ad803be8bd0f1c4f4b7e974" exitCode=0
Apr 22 18:51:11.103465 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.103001 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" event={"ID":"8ecfaa96-6ca2-4448-8418-f8c06647d59a","Type":"ContainerDied","Data":"b74d5256e08dd94b02e433ce2e773a616d6f66d77ad803be8bd0f1c4f4b7e974"}
Apr 22 18:51:11.104685 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.104656 2569 generic.go:358] "Generic (PLEG): container finished" podID="32c07195-42f6-418b-bcdd-48517730c21b" containerID="c4c51c681bb882f22ee74adb576dec3bfe603dee0e14b7e75e54f4e30f765225" exitCode=0
Apr 22 18:51:11.104833 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.104740 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" event={"ID":"32c07195-42f6-418b-bcdd-48517730c21b","Type":"ContainerDied","Data":"c4c51c681bb882f22ee74adb576dec3bfe603dee0e14b7e75e54f4e30f765225"}
Apr 22 18:51:11.250984 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.250959 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd"
Apr 22 18:51:11.274390 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.274366 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v"
Apr 22 18:51:11.406438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.406335 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-util\") pod \"218a6a25-5455-467f-8b36-2e3a9004d2f9\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") "
Apr 22 18:51:11.406438 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.406422 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b87pd\" (UniqueName: \"kubernetes.io/projected/218a6a25-5455-467f-8b36-2e3a9004d2f9-kube-api-access-b87pd\") pod \"218a6a25-5455-467f-8b36-2e3a9004d2f9\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") "
Apr 22 18:51:11.406683 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.406458 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5qk2\" (UniqueName: \"kubernetes.io/projected/a6693871-180e-4097-983c-c6bb8688304b-kube-api-access-j5qk2\") pod \"a6693871-180e-4097-983c-c6bb8688304b\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") "
Apr 22 18:51:11.406683 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.406481 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-util\") pod \"a6693871-180e-4097-983c-c6bb8688304b\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") "
Apr 22 18:51:11.406683 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.406507 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-bundle\") pod \"a6693871-180e-4097-983c-c6bb8688304b\" (UID: \"a6693871-180e-4097-983c-c6bb8688304b\") "
Apr 22 18:51:11.406683 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.406584 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-bundle\") pod \"218a6a25-5455-467f-8b36-2e3a9004d2f9\" (UID: \"218a6a25-5455-467f-8b36-2e3a9004d2f9\") "
Apr 22 18:51:11.407267 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.407236 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-bundle" (OuterVolumeSpecName: "bundle") pod "218a6a25-5455-467f-8b36-2e3a9004d2f9" (UID: "218a6a25-5455-467f-8b36-2e3a9004d2f9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:11.407383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.407361 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-bundle" (OuterVolumeSpecName: "bundle") pod "a6693871-180e-4097-983c-c6bb8688304b" (UID: "a6693871-180e-4097-983c-c6bb8688304b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:11.408792 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.408768 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218a6a25-5455-467f-8b36-2e3a9004d2f9-kube-api-access-b87pd" (OuterVolumeSpecName: "kube-api-access-b87pd") pod "218a6a25-5455-467f-8b36-2e3a9004d2f9" (UID: "218a6a25-5455-467f-8b36-2e3a9004d2f9"). InnerVolumeSpecName "kube-api-access-b87pd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:51:11.409027 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.408996 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6693871-180e-4097-983c-c6bb8688304b-kube-api-access-j5qk2" (OuterVolumeSpecName: "kube-api-access-j5qk2") pod "a6693871-180e-4097-983c-c6bb8688304b" (UID: "a6693871-180e-4097-983c-c6bb8688304b"). InnerVolumeSpecName "kube-api-access-j5qk2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:51:11.411359 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.411334 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-util" (OuterVolumeSpecName: "util") pod "218a6a25-5455-467f-8b36-2e3a9004d2f9" (UID: "218a6a25-5455-467f-8b36-2e3a9004d2f9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:11.412736 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.412699 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-util" (OuterVolumeSpecName: "util") pod "a6693871-180e-4097-983c-c6bb8688304b" (UID: "a6693871-180e-4097-983c-c6bb8688304b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:11.507968 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.507932 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:11.507968 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.507966 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/218a6a25-5455-467f-8b36-2e3a9004d2f9-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:11.508161 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.507980 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b87pd\" (UniqueName: \"kubernetes.io/projected/218a6a25-5455-467f-8b36-2e3a9004d2f9-kube-api-access-b87pd\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:11.508161 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.507998 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j5qk2\" (UniqueName: \"kubernetes.io/projected/a6693871-180e-4097-983c-c6bb8688304b-kube-api-access-j5qk2\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:11.508161 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.508011 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:11.508161 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:11.508022 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a6693871-180e-4097-983c-c6bb8688304b-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:12.113853 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.113819 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd" event={"ID":"a6693871-180e-4097-983c-c6bb8688304b","Type":"ContainerDied","Data":"36f0182f3c3d7c6f0c9b98e06b31c4ee07068252cb545a20a98276af69c30319"}
Apr 22 18:51:12.113853 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.113839 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd"
Apr 22 18:51:12.113853 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.113858 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36f0182f3c3d7c6f0c9b98e06b31c4ee07068252cb545a20a98276af69c30319"
Apr 22 18:51:12.115497 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.115471 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v" event={"ID":"218a6a25-5455-467f-8b36-2e3a9004d2f9","Type":"ContainerDied","Data":"742074a54d18c504b38622baecb121e05683cc0595dc37adebf18d2938012b1a"}
Apr 22 18:51:12.115692 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.115501 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742074a54d18c504b38622baecb121e05683cc0595dc37adebf18d2938012b1a"
Apr 22 18:51:12.115692 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.115509 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v"
Apr 22 18:51:12.117393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.117365 2569 generic.go:358] "Generic (PLEG): container finished" podID="32c07195-42f6-418b-bcdd-48517730c21b" containerID="84d00d994471cb26512e5e2f53d566457c2e4ee7cc9a97e12495dae433dd38a6" exitCode=0
Apr 22 18:51:12.117506 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.117398 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" event={"ID":"32c07195-42f6-418b-bcdd-48517730c21b","Type":"ContainerDied","Data":"84d00d994471cb26512e5e2f53d566457c2e4ee7cc9a97e12495dae433dd38a6"}
Apr 22 18:51:12.246839 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.246811 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:12.315371 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.315340 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-bundle\") pod \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") "
Apr 22 18:51:12.315561 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.315435 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcpx\" (UniqueName: \"kubernetes.io/projected/8ecfaa96-6ca2-4448-8418-f8c06647d59a-kube-api-access-rxcpx\") pod \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") "
Apr 22 18:51:12.315561 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.315477 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-util\") pod \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\" (UID: \"8ecfaa96-6ca2-4448-8418-f8c06647d59a\") "
Apr 22 18:51:12.315993 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.315963 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-bundle" (OuterVolumeSpecName: "bundle") pod "8ecfaa96-6ca2-4448-8418-f8c06647d59a" (UID: "8ecfaa96-6ca2-4448-8418-f8c06647d59a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:12.318078 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.318050 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ecfaa96-6ca2-4448-8418-f8c06647d59a-kube-api-access-rxcpx" (OuterVolumeSpecName: "kube-api-access-rxcpx") pod "8ecfaa96-6ca2-4448-8418-f8c06647d59a" (UID: "8ecfaa96-6ca2-4448-8418-f8c06647d59a"). InnerVolumeSpecName "kube-api-access-rxcpx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:51:12.321791 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.321750 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-util" (OuterVolumeSpecName: "util") pod "8ecfaa96-6ca2-4448-8418-f8c06647d59a" (UID: "8ecfaa96-6ca2-4448-8418-f8c06647d59a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:12.416767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.416646 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxcpx\" (UniqueName: \"kubernetes.io/projected/8ecfaa96-6ca2-4448-8418-f8c06647d59a-kube-api-access-rxcpx\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:12.416767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.416678 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:12.416767 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:12.416687 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ecfaa96-6ca2-4448-8418-f8c06647d59a-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:13.129755 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.129700 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf" event={"ID":"8ecfaa96-6ca2-4448-8418-f8c06647d59a","Type":"ContainerDied","Data":"a7733d6a56bdf9bf85c8d321a2d89ab3318c0ef284392511d8a81593bba5b23b"}
Apr 22 18:51:13.129755 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.129751 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf"
Apr 22 18:51:13.130247 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.129755 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7733d6a56bdf9bf85c8d321a2d89ab3318c0ef284392511d8a81593bba5b23b"
Apr 22 18:51:13.258321 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.258294 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz"
Apr 22 18:51:13.429266 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.429169 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-util\") pod \"32c07195-42f6-418b-bcdd-48517730c21b\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") "
Apr 22 18:51:13.429266 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.429226 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7gcr\" (UniqueName: \"kubernetes.io/projected/32c07195-42f6-418b-bcdd-48517730c21b-kube-api-access-f7gcr\") pod \"32c07195-42f6-418b-bcdd-48517730c21b\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") "
Apr 22 18:51:13.429530 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.429317 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-bundle\") pod \"32c07195-42f6-418b-bcdd-48517730c21b\" (UID: \"32c07195-42f6-418b-bcdd-48517730c21b\") "
Apr 22 18:51:13.429923 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.429894 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-bundle" (OuterVolumeSpecName: "bundle") pod "32c07195-42f6-418b-bcdd-48517730c21b" (UID: "32c07195-42f6-418b-bcdd-48517730c21b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:13.431364 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.431333 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c07195-42f6-418b-bcdd-48517730c21b-kube-api-access-f7gcr" (OuterVolumeSpecName: "kube-api-access-f7gcr") pod "32c07195-42f6-418b-bcdd-48517730c21b" (UID: "32c07195-42f6-418b-bcdd-48517730c21b"). InnerVolumeSpecName "kube-api-access-f7gcr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:51:13.459263 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.459217 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-util" (OuterVolumeSpecName: "util") pod "32c07195-42f6-418b-bcdd-48517730c21b" (UID: "32c07195-42f6-418b-bcdd-48517730c21b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:13.530193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.530157 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7gcr\" (UniqueName: \"kubernetes.io/projected/32c07195-42f6-418b-bcdd-48517730c21b-kube-api-access-f7gcr\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:13.530193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.530189 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:13.530193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:13.530200 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c07195-42f6-418b-bcdd-48517730c21b-util\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:51:14.135158 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:14.135121 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz" event={"ID":"32c07195-42f6-418b-bcdd-48517730c21b","Type":"ContainerDied","Data":"bdabb5cfffecf66a5d07f34c16448e9eff853770118ff5a2adbdc4bcf27dfe8f"}
Apr 22 18:51:14.135158 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:14.135155 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz"
Apr 22 18:51:14.135597 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:14.135158 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdabb5cfffecf66a5d07f34c16448e9eff853770118ff5a2adbdc4bcf27dfe8f"
Apr 22 18:51:23.002380 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002340 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"]
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002697 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c07195-42f6-418b-bcdd-48517730c21b" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002722 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c07195-42f6-418b-bcdd-48517730c21b" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002738 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6693871-180e-4097-983c-c6bb8688304b" containerName="util"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002744 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6693871-180e-4097-983c-c6bb8688304b" containerName="util"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002750 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002755 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002764 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6693871-180e-4097-983c-c6bb8688304b" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002768 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6693871-180e-4097-983c-c6bb8688304b" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002774 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerName="pull"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002779 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerName="pull"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002786 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c07195-42f6-418b-bcdd-48517730c21b" containerName="pull"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002792 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c07195-42f6-418b-bcdd-48517730c21b" containerName="pull"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002800 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002805 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerName="extract"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002816 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c07195-42f6-418b-bcdd-48517730c21b" containerName="util"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002821 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c07195-42f6-418b-bcdd-48517730c21b" containerName="util"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002827 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerName="util"
Apr 22 18:51:23.002822 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002833 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerName="util"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002841 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerName="pull"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002846 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerName="pull"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002853 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerName="util"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002858 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerName="util"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002864 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6693871-180e-4097-983c-c6bb8688304b" containerName="pull"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002870 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6693871-180e-4097-983c-c6bb8688304b" containerName="pull"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002919 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="32c07195-42f6-418b-bcdd-48517730c21b" containerName="extract"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002928 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="218a6a25-5455-467f-8b36-2e3a9004d2f9" containerName="extract"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002935 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6693871-180e-4097-983c-c6bb8688304b" containerName="extract"
Apr 22 18:51:23.003345 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.002942 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ecfaa96-6ca2-4448-8418-f8c06647d59a" containerName="extract"
Apr 22 18:51:23.006017 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.005994 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"
Apr 22 18:51:23.010133 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.010106 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 22 18:51:23.010294 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.010164 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-wbd7v\""
Apr 22 18:51:23.021009 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.020976 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"]
Apr 22 18:51:23.113970 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.113928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph4cw\" (UniqueName: \"kubernetes.io/projected/d0181e7b-79bc-480d-9390-de810fdd12d5-kube-api-access-ph4cw\") pod \"dns-operator-controller-manager-648d5c98bc-xklhn\" (UID: \"d0181e7b-79bc-480d-9390-de810fdd12d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"
Apr 22 18:51:23.214859 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.214815 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ph4cw\" (UniqueName: \"kubernetes.io/projected/d0181e7b-79bc-480d-9390-de810fdd12d5-kube-api-access-ph4cw\") pod \"dns-operator-controller-manager-648d5c98bc-xklhn\" (UID: \"d0181e7b-79bc-480d-9390-de810fdd12d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"
Apr 22 18:51:23.227128 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.227086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph4cw\" (UniqueName: \"kubernetes.io/projected/d0181e7b-79bc-480d-9390-de810fdd12d5-kube-api-access-ph4cw\") pod \"dns-operator-controller-manager-648d5c98bc-xklhn\" (UID: \"d0181e7b-79bc-480d-9390-de810fdd12d5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"
Apr 22 18:51:23.317150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.317114 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"
Apr 22 18:51:23.456814 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:23.456772 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn"]
Apr 22 18:51:23.460205 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:51:23.460171 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0181e7b_79bc_480d_9390_de810fdd12d5.slice/crio-68f4eaa22cb19ea13f60aa93e5b89805808b645a6a5b37aae5ee5bc2e8876ed1 WatchSource:0}: Error finding container 68f4eaa22cb19ea13f60aa93e5b89805808b645a6a5b37aae5ee5bc2e8876ed1: Status 404 returned error can't find the container with id 68f4eaa22cb19ea13f60aa93e5b89805808b645a6a5b37aae5ee5bc2e8876ed1
Apr 22 18:51:24.176418 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:24.176375 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn" event={"ID":"d0181e7b-79bc-480d-9390-de810fdd12d5","Type":"ContainerStarted","Data":"68f4eaa22cb19ea13f60aa93e5b89805808b645a6a5b37aae5ee5bc2e8876ed1"}
Apr 22 18:51:25.310871 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.310827 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68bdc9c74-l88mg"]
Apr 22 18:51:25.314250 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.314218 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.329090 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.329049 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68bdc9c74-l88mg"] Apr 22 18:51:25.435367 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.435325 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-serving-cert\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.435572 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.435376 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-oauth-serving-cert\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.435572 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.435461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-config\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.435572 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.435522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-trusted-ca-bundle\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 
18:51:25.435572 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.435547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmch2\" (UniqueName: \"kubernetes.io/projected/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-kube-api-access-wmch2\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.435830 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.435600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-oauth-config\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.435830 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.435660 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-service-ca\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537104 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537057 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-oauth-serving-cert\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537119 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-config\") pod 
\"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537156 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-trusted-ca-bundle\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537172 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmch2\" (UniqueName: \"kubernetes.io/projected/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-kube-api-access-wmch2\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-oauth-config\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-service-ca\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537322 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537289 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-serving-cert\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.537995 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.537967 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-oauth-serving-cert\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.538110 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.538049 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-service-ca\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.538110 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.538049 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-config\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.538203 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.538164 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-trusted-ca-bundle\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.539884 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.539854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-oauth-config\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.539969 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.539954 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-console-serving-cert\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.548433 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.548401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmch2\" (UniqueName: \"kubernetes.io/projected/6d16941a-798d-4fcf-8cd5-9fac915ba9c9-kube-api-access-wmch2\") pod \"console-68bdc9c74-l88mg\" (UID: \"6d16941a-798d-4fcf-8cd5-9fac915ba9c9\") " pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.625851 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.625809 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:25.770323 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:25.770295 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68bdc9c74-l88mg"] Apr 22 18:51:25.771844 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:51:25.771820 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d16941a_798d_4fcf_8cd5_9fac915ba9c9.slice/crio-c29d0b4a6b820745feb377b4b0f3066050551f57b58ec1d521f573f46d2f7653 WatchSource:0}: Error finding container c29d0b4a6b820745feb377b4b0f3066050551f57b58ec1d521f573f46d2f7653: Status 404 returned error can't find the container with id c29d0b4a6b820745feb377b4b0f3066050551f57b58ec1d521f573f46d2f7653 Apr 22 18:51:26.187811 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.187761 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68bdc9c74-l88mg" event={"ID":"6d16941a-798d-4fcf-8cd5-9fac915ba9c9","Type":"ContainerStarted","Data":"fe87c8ca4575419da1faaf2d69ab8766d17d85a877f15e4aeb9aacd858c2a409"} Apr 22 18:51:26.188016 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.187810 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68bdc9c74-l88mg" event={"ID":"6d16941a-798d-4fcf-8cd5-9fac915ba9c9","Type":"ContainerStarted","Data":"c29d0b4a6b820745feb377b4b0f3066050551f57b58ec1d521f573f46d2f7653"} Apr 22 18:51:26.209419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.209361 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68bdc9c74-l88mg" podStartSLOduration=1.209343243 podStartE2EDuration="1.209343243s" podCreationTimestamp="2026-04-22 18:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:26.206399768 +0000 UTC m=+470.343744175" 
watchObservedRunningTime="2026-04-22 18:51:26.209343243 +0000 UTC m=+470.346687656" Apr 22 18:51:26.420257 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.420230 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg"] Apr 22 18:51:26.423642 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.423620 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" Apr 22 18:51:26.427210 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.427188 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-h7sk4\"" Apr 22 18:51:26.438950 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.438885 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg"] Apr 22 18:51:26.545544 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.545475 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbdvd\" (UniqueName: \"kubernetes.io/projected/cc4b59ce-227f-4758-866a-77f205b25dd3-kube-api-access-sbdvd\") pod \"limitador-operator-controller-manager-85c4996f8c-2zskg\" (UID: \"cc4b59ce-227f-4758-866a-77f205b25dd3\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" Apr 22 18:51:26.647071 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.647033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbdvd\" (UniqueName: \"kubernetes.io/projected/cc4b59ce-227f-4758-866a-77f205b25dd3-kube-api-access-sbdvd\") pod \"limitador-operator-controller-manager-85c4996f8c-2zskg\" (UID: \"cc4b59ce-227f-4758-866a-77f205b25dd3\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" Apr 22 18:51:26.658539 
ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.658505 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbdvd\" (UniqueName: \"kubernetes.io/projected/cc4b59ce-227f-4758-866a-77f205b25dd3-kube-api-access-sbdvd\") pod \"limitador-operator-controller-manager-85c4996f8c-2zskg\" (UID: \"cc4b59ce-227f-4758-866a-77f205b25dd3\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" Apr 22 18:51:26.736170 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.736078 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" Apr 22 18:51:26.916758 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:26.916650 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg"] Apr 22 18:51:26.917982 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:51:26.917938 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc4b59ce_227f_4758_866a_77f205b25dd3.slice/crio-106d1dfe507a774771f7c7fe8de7b4b7a807aeffe0dadd70f999d525313302a5 WatchSource:0}: Error finding container 106d1dfe507a774771f7c7fe8de7b4b7a807aeffe0dadd70f999d525313302a5: Status 404 returned error can't find the container with id 106d1dfe507a774771f7c7fe8de7b4b7a807aeffe0dadd70f999d525313302a5 Apr 22 18:51:27.193358 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:27.193318 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" event={"ID":"cc4b59ce-227f-4758-866a-77f205b25dd3","Type":"ContainerStarted","Data":"106d1dfe507a774771f7c7fe8de7b4b7a807aeffe0dadd70f999d525313302a5"} Apr 22 18:51:27.194920 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:27.194883 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn" event={"ID":"d0181e7b-79bc-480d-9390-de810fdd12d5","Type":"ContainerStarted","Data":"ab9d7f5af7cb0aedfd29fe648813b10431420322d6eaabc0b14269adcb135d35"} Apr 22 18:51:27.195095 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:27.195014 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn" Apr 22 18:51:27.232048 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:27.231986 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn" podStartSLOduration=2.278501107 podStartE2EDuration="5.231967018s" podCreationTimestamp="2026-04-22 18:51:22 +0000 UTC" firstStartedPulling="2026-04-22 18:51:23.462178907 +0000 UTC m=+467.599523298" lastFinishedPulling="2026-04-22 18:51:26.415644804 +0000 UTC m=+470.552989209" observedRunningTime="2026-04-22 18:51:27.228485656 +0000 UTC m=+471.365830070" watchObservedRunningTime="2026-04-22 18:51:27.231967018 +0000 UTC m=+471.369311431" Apr 22 18:51:29.204982 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:29.204888 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" event={"ID":"cc4b59ce-227f-4758-866a-77f205b25dd3","Type":"ContainerStarted","Data":"8e36c34b70d689fa02fe736ac96e7bde49e61682de786b1928e902d7a0237b65"} Apr 22 18:51:29.204982 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:29.204938 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" Apr 22 18:51:29.224583 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:29.224527 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" podStartSLOduration=1.196652387 
podStartE2EDuration="3.22451083s" podCreationTimestamp="2026-04-22 18:51:26 +0000 UTC" firstStartedPulling="2026-04-22 18:51:26.920066873 +0000 UTC m=+471.057411263" lastFinishedPulling="2026-04-22 18:51:28.947925315 +0000 UTC m=+473.085269706" observedRunningTime="2026-04-22 18:51:29.221919953 +0000 UTC m=+473.359264367" watchObservedRunningTime="2026-04-22 18:51:29.22451083 +0000 UTC m=+473.361855256" Apr 22 18:51:35.626807 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:35.626776 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:35.627267 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:35.626819 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:35.631936 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:35.631907 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:36.236799 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:36.236766 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68bdc9c74-l88mg" Apr 22 18:51:36.297621 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:36.297583 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57c7686d4d-wtrkk"] Apr 22 18:51:38.201795 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:38.201761 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xklhn" Apr 22 18:51:40.211320 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:40.211284 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" Apr 22 18:51:45.206381 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.206344 2569 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q"] Apr 22 18:51:45.211363 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.211338 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.213926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.213895 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 18:51:45.213926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.213902 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-wnkm8\"" Apr 22 18:51:45.214150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.213909 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 18:51:45.220647 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.220616 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q"] Apr 22 18:51:45.316639 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.316590 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/259d75e6-ca74-4560-aa60-697f8edb7fe1-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.316855 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.316652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jp9v\" (UniqueName: \"kubernetes.io/projected/259d75e6-ca74-4560-aa60-697f8edb7fe1-kube-api-access-8jp9v\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.316855 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.316687 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/259d75e6-ca74-4560-aa60-697f8edb7fe1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.417617 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.417572 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jp9v\" (UniqueName: \"kubernetes.io/projected/259d75e6-ca74-4560-aa60-697f8edb7fe1-kube-api-access-8jp9v\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.417837 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.417628 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/259d75e6-ca74-4560-aa60-697f8edb7fe1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.417837 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.417692 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/259d75e6-ca74-4560-aa60-697f8edb7fe1-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.417837 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:51:45.417799 2569 secret.go:189] Couldn't get secret 
kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 22 18:51:45.417969 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:51:45.417889 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/259d75e6-ca74-4560-aa60-697f8edb7fe1-plugin-serving-cert podName:259d75e6-ca74-4560-aa60-697f8edb7fe1 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:45.91786914 +0000 UTC m=+490.055213533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/259d75e6-ca74-4560-aa60-697f8edb7fe1-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-g8w4q" (UID: "259d75e6-ca74-4560-aa60-697f8edb7fe1") : secret "plugin-serving-cert" not found Apr 22 18:51:45.418352 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.418332 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/259d75e6-ca74-4560-aa60-697f8edb7fe1-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.427378 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.427357 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jp9v\" (UniqueName: \"kubernetes.io/projected/259d75e6-ca74-4560-aa60-697f8edb7fe1-kube-api-access-8jp9v\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" Apr 22 18:51:45.922798 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.922754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/259d75e6-ca74-4560-aa60-697f8edb7fe1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: 
\"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q"
Apr 22 18:51:45.925275 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:45.925247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/259d75e6-ca74-4560-aa60-697f8edb7fe1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-g8w4q\" (UID: \"259d75e6-ca74-4560-aa60-697f8edb7fe1\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q"
Apr 22 18:51:46.121528 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:46.121483 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q"
Apr 22 18:51:46.263966 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:46.263935 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q"]
Apr 22 18:51:46.265921 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:51:46.265889 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod259d75e6_ca74_4560_aa60_697f8edb7fe1.slice/crio-ab522a89c81943436cce6445b831a57e9f048eb6c254cd56b8f3f12f9d513180 WatchSource:0}: Error finding container ab522a89c81943436cce6445b831a57e9f048eb6c254cd56b8f3f12f9d513180: Status 404 returned error can't find the container with id ab522a89c81943436cce6445b831a57e9f048eb6c254cd56b8f3f12f9d513180
Apr 22 18:51:46.275183 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:46.275153 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" event={"ID":"259d75e6-ca74-4560-aa60-697f8edb7fe1","Type":"ContainerStarted","Data":"ab522a89c81943436cce6445b831a57e9f048eb6c254cd56b8f3f12f9d513180"}
Apr 22 18:51:55.985759 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:55.985697 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg"]
Apr 22 18:51:55.986165 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:55.986045 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" containerName="manager" containerID="cri-o://8e36c34b70d689fa02fe736ac96e7bde49e61682de786b1928e902d7a0237b65" gracePeriod=2
Apr 22 18:51:56.004731 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.004673 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg"]
Apr 22 18:51:56.035042 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.035004 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"]
Apr 22 18:51:56.035728 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.035684 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" containerName="manager"
Apr 22 18:51:56.035880 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.035736 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" containerName="manager"
Apr 22 18:51:56.036359 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.036335 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" containerName="manager"
Apr 22 18:51:56.045057 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.045016 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"
Apr 22 18:51:56.050040 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.049989 2569 status_manager.go:895] "Failed to get status for pod" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" err="pods \"limitador-operator-controller-manager-85c4996f8c-2zskg\" is forbidden: User \"system:node:ip-10-0-132-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-198.ec2.internal' and this object"
Apr 22 18:51:56.057988 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.057956 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"]
Apr 22 18:51:56.120285 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.120245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvv66\" (UniqueName: \"kubernetes.io/projected/8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1-kube-api-access-mvv66\") pod \"limitador-operator-controller-manager-85c4996f8c-pd27j\" (UID: \"8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"
Apr 22 18:51:56.221378 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.221344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvv66\" (UniqueName: \"kubernetes.io/projected/8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1-kube-api-access-mvv66\") pod \"limitador-operator-controller-manager-85c4996f8c-pd27j\" (UID: \"8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"
Apr 22 18:51:56.256998 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.256916 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvv66\" (UniqueName: \"kubernetes.io/projected/8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1-kube-api-access-mvv66\") pod \"limitador-operator-controller-manager-85c4996f8c-pd27j\" (UID: \"8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"
Apr 22 18:51:56.327190 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.327146 2569 generic.go:358] "Generic (PLEG): container finished" podID="cc4b59ce-227f-4758-866a-77f205b25dd3" containerID="8e36c34b70d689fa02fe736ac96e7bde49e61682de786b1928e902d7a0237b65" exitCode=0
Apr 22 18:51:56.366288 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.366252 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"
Apr 22 18:51:56.458747 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:56.458689 2569 status_manager.go:895] "Failed to get status for pod" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" err="pods \"limitador-operator-controller-manager-85c4996f8c-2zskg\" is forbidden: User \"system:node:ip-10-0-132-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-198.ec2.internal' and this object"
Apr 22 18:51:57.861623 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:57.861567 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"]
Apr 22 18:51:57.879754 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:57.879702 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"]
Apr 22 18:51:57.879935 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:57.879750 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:51:57.887310 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:57.887282 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-8vdqp\""
Apr 22 18:51:58.040900 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:58.040865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd82\" (UniqueName: \"kubernetes.io/projected/f4100129-def4-4331-a791-8c4509088fc9-kube-api-access-rvd82\") pod \"kuadrant-operator-controller-manager-55c7f4c975-n6k6g\" (UID: \"f4100129-def4-4331-a791-8c4509088fc9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:51:58.041072 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:58.040959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4100129-def4-4331-a791-8c4509088fc9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-n6k6g\" (UID: \"f4100129-def4-4331-a791-8c4509088fc9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:51:58.142135 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:58.142043 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd82\" (UniqueName: \"kubernetes.io/projected/f4100129-def4-4331-a791-8c4509088fc9-kube-api-access-rvd82\") pod \"kuadrant-operator-controller-manager-55c7f4c975-n6k6g\" (UID: \"f4100129-def4-4331-a791-8c4509088fc9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:51:58.142135 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:58.142124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4100129-def4-4331-a791-8c4509088fc9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-n6k6g\" (UID: \"f4100129-def4-4331-a791-8c4509088fc9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:51:58.142580 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:58.142553 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f4100129-def4-4331-a791-8c4509088fc9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-n6k6g\" (UID: \"f4100129-def4-4331-a791-8c4509088fc9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:51:58.153398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:58.153362 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvd82\" (UniqueName: \"kubernetes.io/projected/f4100129-def4-4331-a791-8c4509088fc9-kube-api-access-rvd82\") pod \"kuadrant-operator-controller-manager-55c7f4c975-n6k6g\" (UID: \"f4100129-def4-4331-a791-8c4509088fc9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:51:58.197840 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:51:58.197796 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:52:01.318155 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:01.318109 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57c7686d4d-wtrkk" podUID="accf0b03-b9e8-40ed-9e8b-59bab6047583" containerName="console" containerID="cri-o://e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c" gracePeriod=15
Apr 22 18:52:01.378269 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:01.378231 2569 patch_prober.go:28] interesting pod/console-57c7686d4d-wtrkk container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.24:8443/health\": dial tcp 10.134.0.24:8443: connect: connection refused" start-of-body=
Apr 22 18:52:01.378464 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:01.378294 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-57c7686d4d-wtrkk" podUID="accf0b03-b9e8-40ed-9e8b-59bab6047583" containerName="console" probeResult="failure" output="Get \"https://10.134.0.24:8443/health\": dial tcp 10.134.0.24:8443: connect: connection refused"
Apr 22 18:52:11.030531 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.030485 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg"
Apr 22 18:52:11.033329 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.033290 2569 status_manager.go:895] "Failed to get status for pod" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" err="pods \"limitador-operator-controller-manager-85c4996f8c-2zskg\" is forbidden: User \"system:node:ip-10-0-132-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-198.ec2.internal' and this object"
Apr 22 18:52:11.167600 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.167569 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbdvd\" (UniqueName: \"kubernetes.io/projected/cc4b59ce-227f-4758-866a-77f205b25dd3-kube-api-access-sbdvd\") pod \"cc4b59ce-227f-4758-866a-77f205b25dd3\" (UID: \"cc4b59ce-227f-4758-866a-77f205b25dd3\") "
Apr 22 18:52:11.169887 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.169850 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4b59ce-227f-4758-866a-77f205b25dd3-kube-api-access-sbdvd" (OuterVolumeSpecName: "kube-api-access-sbdvd") pod "cc4b59ce-227f-4758-866a-77f205b25dd3" (UID: "cc4b59ce-227f-4758-866a-77f205b25dd3"). InnerVolumeSpecName "kube-api-access-sbdvd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:11.179147 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.179116 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57c7686d4d-wtrkk_accf0b03-b9e8-40ed-9e8b-59bab6047583/console/0.log"
Apr 22 18:52:11.179271 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.179197 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57c7686d4d-wtrkk"
Apr 22 18:52:11.184761 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.183015 2569 status_manager.go:895] "Failed to get status for pod" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" err="pods \"limitador-operator-controller-manager-85c4996f8c-2zskg\" is forbidden: User \"system:node:ip-10-0-132-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-198.ec2.internal' and this object"
Apr 22 18:52:11.186214 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.184908 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"]
Apr 22 18:52:11.203571 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.203541 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"]
Apr 22 18:52:11.204487 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:52:11.204456 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4100129_def4_4331_a791_8c4509088fc9.slice/crio-21cbee23a94ca0acf16c2467945447db9a2a659f34601980561bebcd2334bf1f WatchSource:0}: Error finding container 21cbee23a94ca0acf16c2467945447db9a2a659f34601980561bebcd2334bf1f: Status 404 returned error can't find the container with id 21cbee23a94ca0acf16c2467945447db9a2a659f34601980561bebcd2334bf1f
Apr 22 18:52:11.268271 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.268236 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbdvd\" (UniqueName: \"kubernetes.io/projected/cc4b59ce-227f-4758-866a-77f205b25dd3-kube-api-access-sbdvd\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.368768 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.368693 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-config\") pod \"accf0b03-b9e8-40ed-9e8b-59bab6047583\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") "
Apr 22 18:52:11.368944 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.368789 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-serving-cert\") pod \"accf0b03-b9e8-40ed-9e8b-59bab6047583\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") "
Apr 22 18:52:11.368944 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.368854 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-oauth-config\") pod \"accf0b03-b9e8-40ed-9e8b-59bab6047583\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") "
Apr 22 18:52:11.368944 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.368893 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvq7f\" (UniqueName: \"kubernetes.io/projected/accf0b03-b9e8-40ed-9e8b-59bab6047583-kube-api-access-bvq7f\") pod \"accf0b03-b9e8-40ed-9e8b-59bab6047583\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") "
Apr 22 18:52:11.369122 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369025 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-trusted-ca-bundle\") pod \"accf0b03-b9e8-40ed-9e8b-59bab6047583\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") "
Apr 22 18:52:11.369181 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369110 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-config" (OuterVolumeSpecName: "console-config") pod "accf0b03-b9e8-40ed-9e8b-59bab6047583" (UID: "accf0b03-b9e8-40ed-9e8b-59bab6047583"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:52:11.369181 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369146 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-service-ca\") pod \"accf0b03-b9e8-40ed-9e8b-59bab6047583\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") "
Apr 22 18:52:11.369283 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369201 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-oauth-serving-cert\") pod \"accf0b03-b9e8-40ed-9e8b-59bab6047583\" (UID: \"accf0b03-b9e8-40ed-9e8b-59bab6047583\") "
Apr 22 18:52:11.369504 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369479 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-config\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.369631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369526 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "accf0b03-b9e8-40ed-9e8b-59bab6047583" (UID: "accf0b03-b9e8-40ed-9e8b-59bab6047583"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:52:11.369631 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369582 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-service-ca" (OuterVolumeSpecName: "service-ca") pod "accf0b03-b9e8-40ed-9e8b-59bab6047583" (UID: "accf0b03-b9e8-40ed-9e8b-59bab6047583"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:52:11.369749 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.369624 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "accf0b03-b9e8-40ed-9e8b-59bab6047583" (UID: "accf0b03-b9e8-40ed-9e8b-59bab6047583"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:52:11.371358 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.371337 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "accf0b03-b9e8-40ed-9e8b-59bab6047583" (UID: "accf0b03-b9e8-40ed-9e8b-59bab6047583"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:52:11.371704 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.371682 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "accf0b03-b9e8-40ed-9e8b-59bab6047583" (UID: "accf0b03-b9e8-40ed-9e8b-59bab6047583"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:52:11.371831 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.371707 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/accf0b03-b9e8-40ed-9e8b-59bab6047583-kube-api-access-bvq7f" (OuterVolumeSpecName: "kube-api-access-bvq7f") pod "accf0b03-b9e8-40ed-9e8b-59bab6047583" (UID: "accf0b03-b9e8-40ed-9e8b-59bab6047583"). InnerVolumeSpecName "kube-api-access-bvq7f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:11.399881 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.399841 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" event={"ID":"259d75e6-ca74-4560-aa60-697f8edb7fe1","Type":"ContainerStarted","Data":"06aa0eca701e0b9ccbe3afb3dd51ff0dbaa0215965276133ae03265c275e8223"}
Apr 22 18:52:11.401059 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.401027 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g" event={"ID":"f4100129-def4-4331-a791-8c4509088fc9","Type":"ContainerStarted","Data":"21cbee23a94ca0acf16c2467945447db9a2a659f34601980561bebcd2334bf1f"}
Apr 22 18:52:11.402203 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.402187 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg"
Apr 22 18:52:11.402285 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.402225 2569 scope.go:117] "RemoveContainer" containerID="8e36c34b70d689fa02fe736ac96e7bde49e61682de786b1928e902d7a0237b65"
Apr 22 18:52:11.403925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.403860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j" event={"ID":"8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1","Type":"ContainerStarted","Data":"ab77d4b3cd9e293fbe56157f779d491d326aaee9cda8885f066828f48f6a7a56"}
Apr 22 18:52:11.403925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.403895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j" event={"ID":"8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1","Type":"ContainerStarted","Data":"044e595cfa3d5cf99cef6f61a177a93173742385cd3e75ffd36a8c312e57137a"}
Apr 22 18:52:11.404090 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.403979 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"
Apr 22 18:52:11.405307 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.405287 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57c7686d4d-wtrkk_accf0b03-b9e8-40ed-9e8b-59bab6047583/console/0.log"
Apr 22 18:52:11.405392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.405331 2569 generic.go:358] "Generic (PLEG): container finished" podID="accf0b03-b9e8-40ed-9e8b-59bab6047583" containerID="e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c" exitCode=2
Apr 22 18:52:11.405392 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.405373 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c7686d4d-wtrkk" event={"ID":"accf0b03-b9e8-40ed-9e8b-59bab6047583","Type":"ContainerDied","Data":"e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c"}
Apr 22 18:52:11.405516 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.405395 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57c7686d4d-wtrkk" event={"ID":"accf0b03-b9e8-40ed-9e8b-59bab6047583","Type":"ContainerDied","Data":"524d9dc916f3756242d366577873bb21ba64f9f118dc266c6006ba97e09acb56"}
Apr 22 18:52:11.405516 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.405431 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57c7686d4d-wtrkk"
Apr 22 18:52:11.412230 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.412150 2569 scope.go:117] "RemoveContainer" containerID="e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c"
Apr 22 18:52:11.421021 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.420996 2569 scope.go:117] "RemoveContainer" containerID="e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c"
Apr 22 18:52:11.421372 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:52:11.421346 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c\": container with ID starting with e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c not found: ID does not exist" containerID="e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c"
Apr 22 18:52:11.421481 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.421378 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c"} err="failed to get container status \"e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c\": rpc error: code = NotFound desc = could not find container \"e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c\": container with ID starting with e4507bea4f7859d69a979fcc1b0833610b0467307dc1a3aca691fd507e893b6c not found: ID does not exist"
Apr 22 18:52:11.424141 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.424093 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g8w4q" podStartSLOduration=1.586217422 podStartE2EDuration="26.424078179s" podCreationTimestamp="2026-04-22 18:51:45 +0000 UTC" firstStartedPulling="2026-04-22 18:51:46.267729859 +0000 UTC m=+490.405074253" lastFinishedPulling="2026-04-22 18:52:11.105590603 +0000 UTC m=+515.242935010" observedRunningTime="2026-04-22 18:52:11.421685018 +0000 UTC m=+515.559029427" watchObservedRunningTime="2026-04-22 18:52:11.424078179 +0000 UTC m=+515.561422591"
Apr 22 18:52:11.424276 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.424190 2569 status_manager.go:895] "Failed to get status for pod" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" err="pods \"limitador-operator-controller-manager-85c4996f8c-2zskg\" is forbidden: User \"system:node:ip-10-0-132-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-198.ec2.internal' and this object"
Apr 22 18:52:11.451593 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.451544 2569 status_manager.go:895] "Failed to get status for pod" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2zskg" err="pods \"limitador-operator-controller-manager-85c4996f8c-2zskg\" is forbidden: User \"system:node:ip-10-0-132-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-198.ec2.internal' and this object"
Apr 22 18:52:11.470105 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.470070 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-service-ca\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.470105 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.470102 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-oauth-serving-cert\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.470105 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.470113 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-serving-cert\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.470419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.470123 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/accf0b03-b9e8-40ed-9e8b-59bab6047583-console-oauth-config\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.470419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.470132 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bvq7f\" (UniqueName: \"kubernetes.io/projected/accf0b03-b9e8-40ed-9e8b-59bab6047583-kube-api-access-bvq7f\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.470419 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.470142 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/accf0b03-b9e8-40ed-9e8b-59bab6047583-trusted-ca-bundle\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\""
Apr 22 18:52:11.472395 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.472344 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j" podStartSLOduration=16.472326072 podStartE2EDuration="16.472326072s" podCreationTimestamp="2026-04-22 18:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:52:11.449279503 +0000 UTC m=+515.586623917" watchObservedRunningTime="2026-04-22 18:52:11.472326072 +0000 UTC m=+515.609670534"
Apr 22 18:52:11.474049 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.474029 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57c7686d4d-wtrkk"]
Apr 22 18:52:11.479890 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:11.479862 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57c7686d4d-wtrkk"]
Apr 22 18:52:12.454250 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:12.454213 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="accf0b03-b9e8-40ed-9e8b-59bab6047583" path="/var/lib/kubelet/pods/accf0b03-b9e8-40ed-9e8b-59bab6047583/volumes"
Apr 22 18:52:12.454684 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:12.454604 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc4b59ce-227f-4758-866a-77f205b25dd3" path="/var/lib/kubelet/pods/cc4b59ce-227f-4758-866a-77f205b25dd3/volumes"
Apr 22 18:52:16.432517 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:16.432483 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g" event={"ID":"f4100129-def4-4331-a791-8c4509088fc9","Type":"ContainerStarted","Data":"18b7e35a1c89e70d333aa3311cf152019dac6e705790aa2a3cee4c886082d91f"}
Apr 22 18:52:16.432956 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:16.432800 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:52:16.461073 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:16.461018 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g" podStartSLOduration=14.919477325 podStartE2EDuration="19.461004018s" podCreationTimestamp="2026-04-22 18:51:57 +0000 UTC" firstStartedPulling="2026-04-22 18:52:11.206980784 +0000 UTC m=+515.344325178" lastFinishedPulling="2026-04-22 18:52:15.748507479 +0000 UTC m=+519.885851871" observedRunningTime="2026-04-22 18:52:16.455030374 +0000 UTC m=+520.592374788" watchObservedRunningTime="2026-04-22 18:52:16.461004018 +0000 UTC m=+520.598348431"
Apr 22 18:52:22.413207 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:22.413173 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pd27j"
Apr 22 18:52:27.439647 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:27.439611 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-n6k6g"
Apr 22 18:52:49.539812 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.539766 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-tlpb6"]
Apr 22 18:52:49.540323 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.540302 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="accf0b03-b9e8-40ed-9e8b-59bab6047583" containerName="console"
Apr 22 18:52:49.540399 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.540327 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="accf0b03-b9e8-40ed-9e8b-59bab6047583" containerName="console"
Apr 22 18:52:49.540452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.540435 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="accf0b03-b9e8-40ed-9e8b-59bab6047583" containerName="console"
Apr 22 18:52:49.542482 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.542456 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tlpb6"
Apr 22 18:52:49.545194 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.545174 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-sgc2m\""
Apr 22 18:52:49.551150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.551123 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-tlpb6"]
Apr 22 18:52:49.720687 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.720649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxfl\" (UniqueName: \"kubernetes.io/projected/91a3da80-60c7-45d6-95de-94aadb17c1e2-kube-api-access-7zxfl\") pod \"authorino-7498df8756-tlpb6\" (UID: \"91a3da80-60c7-45d6-95de-94aadb17c1e2\") " pod="kuadrant-system/authorino-7498df8756-tlpb6"
Apr 22 18:52:49.821425 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.821385 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxfl\" (UniqueName: \"kubernetes.io/projected/91a3da80-60c7-45d6-95de-94aadb17c1e2-kube-api-access-7zxfl\") pod \"authorino-7498df8756-tlpb6\" (UID: \"91a3da80-60c7-45d6-95de-94aadb17c1e2\") " pod="kuadrant-system/authorino-7498df8756-tlpb6"
Apr 22 18:52:49.830468 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.830442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxfl\" (UniqueName: \"kubernetes.io/projected/91a3da80-60c7-45d6-95de-94aadb17c1e2-kube-api-access-7zxfl\") pod \"authorino-7498df8756-tlpb6\" (UID: \"91a3da80-60c7-45d6-95de-94aadb17c1e2\") " pod="kuadrant-system/authorino-7498df8756-tlpb6"
Apr 22 18:52:49.852960 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.852921 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tlpb6"
Apr 22 18:52:49.995362 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:49.995331 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-tlpb6"]
Apr 22 18:52:49.997867 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:52:49.997839 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91a3da80_60c7_45d6_95de_94aadb17c1e2.slice/crio-ae1a72a9285dec9adf4104d1e1e1dcac94be36e311aae27892e7b1a071626ede WatchSource:0}: Error finding container ae1a72a9285dec9adf4104d1e1e1dcac94be36e311aae27892e7b1a071626ede: Status 404 returned error can't find the container with id ae1a72a9285dec9adf4104d1e1e1dcac94be36e311aae27892e7b1a071626ede
Apr 22 18:52:50.573860 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:50.573821 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tlpb6" event={"ID":"91a3da80-60c7-45d6-95de-94aadb17c1e2","Type":"ContainerStarted","Data":"ae1a72a9285dec9adf4104d1e1e1dcac94be36e311aae27892e7b1a071626ede"}
Apr 22 18:52:53.589698 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:52:53.589659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tlpb6" event={"ID":"91a3da80-60c7-45d6-95de-94aadb17c1e2","Type":"ContainerStarted","Data":"0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14"}
Apr 22 18:53:19.582640 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:19.582564 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-tlpb6" podStartSLOduration=27.845850515 podStartE2EDuration="30.582541421s" podCreationTimestamp="2026-04-22 18:52:49 +0000 UTC" firstStartedPulling="2026-04-22 18:52:49.999196005 +0000 UTC m=+554.136540400" lastFinishedPulling="2026-04-22 18:52:52.735886898 +0000 UTC m=+556.873231306" observedRunningTime="2026-04-22 18:52:53.6041767 +0000 UTC m=+557.741521115" watchObservedRunningTime="2026-04-22 18:53:19.582541421 +0000 UTC m=+583.719885835"
Apr 22 18:53:19.584353 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:19.584322 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-tlpb6"]
Apr 22 18:53:19.584612 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:19.584578 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-tlpb6" podUID="91a3da80-60c7-45d6-95de-94aadb17c1e2" containerName="authorino" containerID="cri-o://0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14" gracePeriod=30
Apr 22 18:53:19.829735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:19.829682 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tlpb6"
Apr 22 18:53:19.886780 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:19.886678 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zxfl\" (UniqueName: \"kubernetes.io/projected/91a3da80-60c7-45d6-95de-94aadb17c1e2-kube-api-access-7zxfl\") pod \"91a3da80-60c7-45d6-95de-94aadb17c1e2\" (UID: \"91a3da80-60c7-45d6-95de-94aadb17c1e2\") "
Apr 22 18:53:19.888780 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:19.888748 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a3da80-60c7-45d6-95de-94aadb17c1e2-kube-api-access-7zxfl" (OuterVolumeSpecName: "kube-api-access-7zxfl") pod "91a3da80-60c7-45d6-95de-94aadb17c1e2" (UID: "91a3da80-60c7-45d6-95de-94aadb17c1e2"). InnerVolumeSpecName "kube-api-access-7zxfl".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:19.988187 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:19.988145 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zxfl\" (UniqueName: \"kubernetes.io/projected/91a3da80-60c7-45d6-95de-94aadb17c1e2-kube-api-access-7zxfl\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:53:20.700432 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.700396 2569 generic.go:358] "Generic (PLEG): container finished" podID="91a3da80-60c7-45d6-95de-94aadb17c1e2" containerID="0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14" exitCode=0 Apr 22 18:53:20.700934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.700449 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tlpb6" Apr 22 18:53:20.700934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.700484 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tlpb6" event={"ID":"91a3da80-60c7-45d6-95de-94aadb17c1e2","Type":"ContainerDied","Data":"0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14"} Apr 22 18:53:20.700934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.700520 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tlpb6" event={"ID":"91a3da80-60c7-45d6-95de-94aadb17c1e2","Type":"ContainerDied","Data":"ae1a72a9285dec9adf4104d1e1e1dcac94be36e311aae27892e7b1a071626ede"} Apr 22 18:53:20.700934 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.700534 2569 scope.go:117] "RemoveContainer" containerID="0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14" Apr 22 18:53:20.709762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.709735 2569 scope.go:117] "RemoveContainer" containerID="0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14" Apr 22 18:53:20.710060 ip-10-0-132-198 kubenswrapper[2569]: 
E0422 18:53:20.710039 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14\": container with ID starting with 0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14 not found: ID does not exist" containerID="0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14" Apr 22 18:53:20.710129 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.710074 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14"} err="failed to get container status \"0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14\": rpc error: code = NotFound desc = could not find container \"0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14\": container with ID starting with 0f60b7f1d211607700ab7c93ced12a0a1541f3f5e3d6096076830514c935fe14 not found: ID does not exist" Apr 22 18:53:20.716011 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.715952 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-tlpb6"] Apr 22 18:53:20.719368 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:20.719341 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-tlpb6"] Apr 22 18:53:22.453831 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:22.453787 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a3da80-60c7-45d6-95de-94aadb17c1e2" path="/var/lib/kubelet/pods/91a3da80-60c7-45d6-95de-94aadb17c1e2/volumes" Apr 22 18:53:26.717280 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.717237 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6f75cd58bf-224g9"] Apr 22 18:53:26.717827 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.717807 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="91a3da80-60c7-45d6-95de-94aadb17c1e2" containerName="authorino" Apr 22 18:53:26.717906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.717830 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a3da80-60c7-45d6-95de-94aadb17c1e2" containerName="authorino" Apr 22 18:53:26.717966 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.717937 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="91a3da80-60c7-45d6-95de-94aadb17c1e2" containerName="authorino" Apr 22 18:53:26.723025 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.723001 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:26.725561 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.725539 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 18:53:26.725766 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.725552 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 18:53:26.732091 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.732058 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6f75cd58bf-224g9"] Apr 22 18:53:26.847628 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.847592 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls\") pod \"maas-api-6f75cd58bf-224g9\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:26.847828 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.847731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjllp\" (UniqueName: 
\"kubernetes.io/projected/c66689bf-c00d-4198-a769-f1f533e584f0-kube-api-access-hjllp\") pod \"maas-api-6f75cd58bf-224g9\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:26.949188 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.949150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjllp\" (UniqueName: \"kubernetes.io/projected/c66689bf-c00d-4198-a769-f1f533e584f0-kube-api-access-hjllp\") pod \"maas-api-6f75cd58bf-224g9\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:26.949372 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.949215 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls\") pod \"maas-api-6f75cd58bf-224g9\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:26.949372 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:53:26.949316 2569 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 22 18:53:26.949448 ip-10-0-132-198 kubenswrapper[2569]: E0422 18:53:26.949390 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls podName:c66689bf-c00d-4198-a769-f1f533e584f0 nodeName:}" failed. No retries permitted until 2026-04-22 18:53:27.449374334 +0000 UTC m=+591.586718730 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls") pod "maas-api-6f75cd58bf-224g9" (UID: "c66689bf-c00d-4198-a769-f1f533e584f0") : secret "maas-api-serving-cert" not found Apr 22 18:53:26.957963 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:26.957928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjllp\" (UniqueName: \"kubernetes.io/projected/c66689bf-c00d-4198-a769-f1f533e584f0-kube-api-access-hjllp\") pod \"maas-api-6f75cd58bf-224g9\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:27.454569 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:27.454536 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls\") pod \"maas-api-6f75cd58bf-224g9\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:27.456955 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:27.456929 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls\") pod \"maas-api-6f75cd58bf-224g9\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:27.636700 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:27.636656 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:27.772383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:27.772337 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6f75cd58bf-224g9"] Apr 22 18:53:27.773408 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:53:27.773378 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc66689bf_c00d_4198_a769_f1f533e584f0.slice/crio-5a184f576dfb7a7b568eb70f305edfc0e491730202dc802b4268c5a2e5f9b21c WatchSource:0}: Error finding container 5a184f576dfb7a7b568eb70f305edfc0e491730202dc802b4268c5a2e5f9b21c: Status 404 returned error can't find the container with id 5a184f576dfb7a7b568eb70f305edfc0e491730202dc802b4268c5a2e5f9b21c Apr 22 18:53:28.737150 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:28.737106 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6f75cd58bf-224g9" event={"ID":"c66689bf-c00d-4198-a769-f1f533e584f0","Type":"ContainerStarted","Data":"5a184f576dfb7a7b568eb70f305edfc0e491730202dc802b4268c5a2e5f9b21c"} Apr 22 18:53:30.747008 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:30.746968 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6f75cd58bf-224g9" event={"ID":"c66689bf-c00d-4198-a769-f1f533e584f0","Type":"ContainerStarted","Data":"f52e64b793e64c9c742d0df65d7da03ba4e35e047bf8732fce50dd9a54a3d0cc"} Apr 22 18:53:30.747460 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:30.747086 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:53:30.769307 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:30.769259 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6f75cd58bf-224g9" podStartSLOduration=2.66309874 podStartE2EDuration="4.769244161s" podCreationTimestamp="2026-04-22 18:53:26 +0000 UTC" 
firstStartedPulling="2026-04-22 18:53:27.774754602 +0000 UTC m=+591.912098996" lastFinishedPulling="2026-04-22 18:53:29.880900023 +0000 UTC m=+594.018244417" observedRunningTime="2026-04-22 18:53:30.766494624 +0000 UTC m=+594.903839058" watchObservedRunningTime="2026-04-22 18:53:30.769244161 +0000 UTC m=+594.906588624" Apr 22 18:53:36.398553 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:36.398528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:53:36.398970 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:36.398527 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:53:36.404690 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:36.404669 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:53:36.404690 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:36.404669 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:53:36.757114 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:53:36.757039 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:54:05.753017 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:05.752923 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6f75cd58bf-224g9"] Apr 22 18:54:05.753649 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:05.753270 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-6f75cd58bf-224g9" podUID="c66689bf-c00d-4198-a769-f1f533e584f0" 
containerName="maas-api" containerID="cri-o://f52e64b793e64c9c742d0df65d7da03ba4e35e047bf8732fce50dd9a54a3d0cc" gracePeriod=30 Apr 22 18:54:05.886538 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:05.886502 2569 generic.go:358] "Generic (PLEG): container finished" podID="c66689bf-c00d-4198-a769-f1f533e584f0" containerID="f52e64b793e64c9c742d0df65d7da03ba4e35e047bf8732fce50dd9a54a3d0cc" exitCode=0 Apr 22 18:54:05.886749 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:05.886574 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6f75cd58bf-224g9" event={"ID":"c66689bf-c00d-4198-a769-f1f533e584f0","Type":"ContainerDied","Data":"f52e64b793e64c9c742d0df65d7da03ba4e35e047bf8732fce50dd9a54a3d0cc"} Apr 22 18:54:06.003403 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.003324 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:54:06.094450 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.094410 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjllp\" (UniqueName: \"kubernetes.io/projected/c66689bf-c00d-4198-a769-f1f533e584f0-kube-api-access-hjllp\") pod \"c66689bf-c00d-4198-a769-f1f533e584f0\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " Apr 22 18:54:06.094660 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.094578 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls\") pod \"c66689bf-c00d-4198-a769-f1f533e584f0\" (UID: \"c66689bf-c00d-4198-a769-f1f533e584f0\") " Apr 22 18:54:06.096884 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.096846 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66689bf-c00d-4198-a769-f1f533e584f0-kube-api-access-hjllp" (OuterVolumeSpecName: "kube-api-access-hjllp") pod 
"c66689bf-c00d-4198-a769-f1f533e584f0" (UID: "c66689bf-c00d-4198-a769-f1f533e584f0"). InnerVolumeSpecName "kube-api-access-hjllp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:06.096884 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.096846 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "c66689bf-c00d-4198-a769-f1f533e584f0" (UID: "c66689bf-c00d-4198-a769-f1f533e584f0"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:54:06.195609 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.195574 2569 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c66689bf-c00d-4198-a769-f1f533e584f0-maas-api-tls\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:54:06.195609 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.195605 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjllp\" (UniqueName: \"kubernetes.io/projected/c66689bf-c00d-4198-a769-f1f533e584f0-kube-api-access-hjllp\") on node \"ip-10-0-132-198.ec2.internal\" DevicePath \"\"" Apr 22 18:54:06.895022 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.894978 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6f75cd58bf-224g9" event={"ID":"c66689bf-c00d-4198-a769-f1f533e584f0","Type":"ContainerDied","Data":"5a184f576dfb7a7b568eb70f305edfc0e491730202dc802b4268c5a2e5f9b21c"} Apr 22 18:54:06.895463 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.895036 2569 scope.go:117] "RemoveContainer" containerID="f52e64b793e64c9c742d0df65d7da03ba4e35e047bf8732fce50dd9a54a3d0cc" Apr 22 18:54:06.895463 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.894998 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6f75cd58bf-224g9" Apr 22 18:54:06.916040 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.916006 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6f75cd58bf-224g9"] Apr 22 18:54:06.920679 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:06.920647 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6f75cd58bf-224g9"] Apr 22 18:54:08.453740 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:08.453691 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66689bf-c00d-4198-a769-f1f533e584f0" path="/var/lib/kubelet/pods/c66689bf-c00d-4198-a769-f1f533e584f0/volumes" Apr 22 18:54:09.527054 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.527019 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"] Apr 22 18:54:09.527432 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.527378 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c66689bf-c00d-4198-a769-f1f533e584f0" containerName="maas-api" Apr 22 18:54:09.527432 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.527388 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66689bf-c00d-4198-a769-f1f533e584f0" containerName="maas-api" Apr 22 18:54:09.527502 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.527455 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c66689bf-c00d-4198-a769-f1f533e584f0" containerName="maas-api" Apr 22 18:54:09.531046 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.531026 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.533975 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.533942 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 18:54:09.534950 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.534937 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gxmms\"" Apr 22 18:54:09.535264 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.535242 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 22 18:54:09.535325 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.535310 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 18:54:09.551515 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.551478 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"] Apr 22 18:54:09.629627 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.629589 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7bj\" (UniqueName: \"kubernetes.io/projected/330905a2-82a2-4cfc-8cb4-35b50c022ca3-kube-api-access-gc7bj\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.629845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.629640 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.629845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.629701 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.629845 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.629821 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/330905a2-82a2-4cfc-8cb4-35b50c022ca3-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.629993 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.629870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.629993 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.629956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731008 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.730965 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731199 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731024 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/330905a2-82a2-4cfc-8cb4-35b50c022ca3-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731199 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731044 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731199 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731077 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731199 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731107 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7bj\" (UniqueName: \"kubernetes.io/projected/330905a2-82a2-4cfc-8cb4-35b50c022ca3-kube-api-access-gc7bj\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: 
\"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731199 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731125 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731490 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731449 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731582 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731517 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.731582 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.731555 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" Apr 22 18:54:09.733470 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.733441 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/330905a2-82a2-4cfc-8cb4-35b50c022ca3-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"
Apr 22 18:54:09.733784 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.733766 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/330905a2-82a2-4cfc-8cb4-35b50c022ca3-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"
Apr 22 18:54:09.739738 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.739696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7bj\" (UniqueName: \"kubernetes.io/projected/330905a2-82a2-4cfc-8cb4-35b50c022ca3-kube-api-access-gc7bj\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wnz4c\" (UID: \"330905a2-82a2-4cfc-8cb4-35b50c022ca3\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"
Apr 22 18:54:09.841776 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.841742 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"
Apr 22 18:54:09.980907 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.980880 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"]
Apr 22 18:54:09.982748 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:54:09.982693 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330905a2_82a2_4cfc_8cb4_35b50c022ca3.slice/crio-3ae2e1ec374d17c32f8018aa80fe339cc571216f343ba2c61203a6ebeb78b698 WatchSource:0}: Error finding container 3ae2e1ec374d17c32f8018aa80fe339cc571216f343ba2c61203a6ebeb78b698: Status 404 returned error can't find the container with id 3ae2e1ec374d17c32f8018aa80fe339cc571216f343ba2c61203a6ebeb78b698
Apr 22 18:54:09.984475 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:09.984459 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:54:10.915025 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:10.914989 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" event={"ID":"330905a2-82a2-4cfc-8cb4-35b50c022ca3","Type":"ContainerStarted","Data":"3ae2e1ec374d17c32f8018aa80fe339cc571216f343ba2c61203a6ebeb78b698"}
Apr 22 18:54:15.938870 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:15.938829 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" event={"ID":"330905a2-82a2-4cfc-8cb4-35b50c022ca3","Type":"ContainerStarted","Data":"65bf204a551cfbea33a9d53f57564303f326c2bb9c3a92b67dd6ce63f6c0f973"}
Apr 22 18:54:20.964797 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:20.964761 2569 generic.go:358] "Generic (PLEG): container finished" podID="330905a2-82a2-4cfc-8cb4-35b50c022ca3" containerID="65bf204a551cfbea33a9d53f57564303f326c2bb9c3a92b67dd6ce63f6c0f973" exitCode=0
Apr 22 18:54:20.965335 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:20.964842 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" event={"ID":"330905a2-82a2-4cfc-8cb4-35b50c022ca3","Type":"ContainerDied","Data":"65bf204a551cfbea33a9d53f57564303f326c2bb9c3a92b67dd6ce63f6c0f973"}
Apr 22 18:54:22.975599 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:22.975561 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" event={"ID":"330905a2-82a2-4cfc-8cb4-35b50c022ca3","Type":"ContainerStarted","Data":"f3c296cef49b0da05bd222470d75071d2b06598449d7566b1b5c372751255c32"}
Apr 22 18:54:22.975970 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:22.975860 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"
Apr 22 18:54:22.995916 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:22.995853 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c" podStartSLOduration=1.879021896 podStartE2EDuration="13.99583155s" podCreationTimestamp="2026-04-22 18:54:09 +0000 UTC" firstStartedPulling="2026-04-22 18:54:09.984584449 +0000 UTC m=+634.121928839" lastFinishedPulling="2026-04-22 18:54:22.101394093 +0000 UTC m=+646.238738493" observedRunningTime="2026-04-22 18:54:22.995159853 +0000 UTC m=+647.132504267" watchObservedRunningTime="2026-04-22 18:54:22.99583155 +0000 UTC m=+647.133175968"
Apr 22 18:54:33.992806 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:33.992772 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wnz4c"
Apr 22 18:54:47.162919 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.162886 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"]
Apr 22 18:54:47.201497 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.201452 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"]
Apr 22 18:54:47.201672 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.201618 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.204894 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.204870 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\""
Apr 22 18:54:47.272166 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.272131 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.272356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.272188 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.272356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.272211 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmvk\" (UniqueName: \"kubernetes.io/projected/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-kube-api-access-hkmvk\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.272356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.272242 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.272356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.272267 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.272356 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.272301 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373177 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373144 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373177 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373181 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmvk\" (UniqueName: \"kubernetes.io/projected/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-kube-api-access-hkmvk\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373434 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373204 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373434 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373230 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373530 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373433 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373574 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373814 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373794 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373923 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373852 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.373988 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.373959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.375653 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.375631 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.375786 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.375770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.382957 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.382932 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmvk\" (UniqueName: \"kubernetes.io/projected/51e85ccb-1a1e-4a5a-92ef-2b98dde465ce-kube-api-access-hkmvk\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl\" (UID: \"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.512819 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.512705 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:47.660516 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:47.660474 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"]
Apr 22 18:54:47.663163 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:54:47.663128 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e85ccb_1a1e_4a5a_92ef_2b98dde465ce.slice/crio-3d5d51481da4a0939fc81063bab7f86d6de5e11cfa13f710567e4db4a9b28b5b WatchSource:0}: Error finding container 3d5d51481da4a0939fc81063bab7f86d6de5e11cfa13f710567e4db4a9b28b5b: Status 404 returned error can't find the container with id 3d5d51481da4a0939fc81063bab7f86d6de5e11cfa13f710567e4db4a9b28b5b
Apr 22 18:54:48.075701 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:48.075651 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl" event={"ID":"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce","Type":"ContainerStarted","Data":"c7fcfa4da264c416e0ba9ef10f9fd6e19bf76abc7ea8fa474eb1dddb31618c41"}
Apr 22 18:54:48.075701 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:48.075696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl" event={"ID":"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce","Type":"ContainerStarted","Data":"3d5d51481da4a0939fc81063bab7f86d6de5e11cfa13f710567e4db4a9b28b5b"}
Apr 22 18:54:54.104658 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.104622 2569 generic.go:358] "Generic (PLEG): container finished" podID="51e85ccb-1a1e-4a5a-92ef-2b98dde465ce" containerID="c7fcfa4da264c416e0ba9ef10f9fd6e19bf76abc7ea8fa474eb1dddb31618c41" exitCode=0
Apr 22 18:54:54.104658 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.104666 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl" event={"ID":"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce","Type":"ContainerDied","Data":"c7fcfa4da264c416e0ba9ef10f9fd6e19bf76abc7ea8fa474eb1dddb31618c41"}
Apr 22 18:54:54.851865 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.851825 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"]
Apr 22 18:54:54.855635 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.855607 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:54.858131 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.858108 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 22 18:54:54.864989 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.864958 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"]
Apr 22 18:54:54.943210 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.943171 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:54.943405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.943237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a140885-cd59-4ab9-9ef2-5a32e539c350-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:54.943405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.943275 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:54.943405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.943310 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:54.943405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.943347 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:54.943405 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:54.943373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrrr\" (UniqueName: \"kubernetes.io/projected/9a140885-cd59-4ab9-9ef2-5a32e539c350-kube-api-access-vnrrr\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.044471 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.044430 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.044735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.044505 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a140885-cd59-4ab9-9ef2-5a32e539c350-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.044735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.044540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.044735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.044581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.044735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.044618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.044735 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.044646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrrr\" (UniqueName: \"kubernetes.io/projected/9a140885-cd59-4ab9-9ef2-5a32e539c350-kube-api-access-vnrrr\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.045304 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.045277 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.045497 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.045466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.045588 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.045471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.047594 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.047569 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a140885-cd59-4ab9-9ef2-5a32e539c350-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.047730 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.047609 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a140885-cd59-4ab9-9ef2-5a32e539c350-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.054854 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.054826 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrrr\" (UniqueName: \"kubernetes.io/projected/9a140885-cd59-4ab9-9ef2-5a32e539c350-kube-api-access-vnrrr\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j\" (UID: \"9a140885-cd59-4ab9-9ef2-5a32e539c350\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.114828 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.114734 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl" event={"ID":"51e85ccb-1a1e-4a5a-92ef-2b98dde465ce","Type":"ContainerStarted","Data":"498597d9432ee9b9e88267105e36d32db415122badba6dbfbeaa8823c5fcb6e2"}
Apr 22 18:54:55.115177 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.114958 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:54:55.136194 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.136139 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl" podStartSLOduration=7.881004257 podStartE2EDuration="8.136124343s" podCreationTimestamp="2026-04-22 18:54:47 +0000 UTC" firstStartedPulling="2026-04-22 18:54:54.105329568 +0000 UTC m=+678.242673959" lastFinishedPulling="2026-04-22 18:54:54.360449654 +0000 UTC m=+678.497794045" observedRunningTime="2026-04-22 18:54:55.13541778 +0000 UTC m=+679.272762189" watchObservedRunningTime="2026-04-22 18:54:55.136124343 +0000 UTC m=+679.273468755"
Apr 22 18:54:55.166897 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.166856 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:54:55.305388 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:55.305354 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"]
Apr 22 18:54:55.306748 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:54:55.306701 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a140885_cd59_4ab9_9ef2_5a32e539c350.slice/crio-870797e929798637a231ae1aff6ce505bcfbf347ad45bd54b459237c1d61a169 WatchSource:0}: Error finding container 870797e929798637a231ae1aff6ce505bcfbf347ad45bd54b459237c1d61a169: Status 404 returned error can't find the container with id 870797e929798637a231ae1aff6ce505bcfbf347ad45bd54b459237c1d61a169
Apr 22 18:54:56.127365 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:56.127319 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j" event={"ID":"9a140885-cd59-4ab9-9ef2-5a32e539c350","Type":"ContainerStarted","Data":"bac812881ef54316d3f34e01a16a937e07c30fb26026bb0688b401fcb988a17b"}
Apr 22 18:54:56.127365 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:54:56.127368 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j" event={"ID":"9a140885-cd59-4ab9-9ef2-5a32e539c350","Type":"ContainerStarted","Data":"870797e929798637a231ae1aff6ce505bcfbf347ad45bd54b459237c1d61a169"}
Apr 22 18:55:02.154664 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:55:02.154624 2569 generic.go:358] "Generic (PLEG): container finished" podID="9a140885-cd59-4ab9-9ef2-5a32e539c350" containerID="bac812881ef54316d3f34e01a16a937e07c30fb26026bb0688b401fcb988a17b" exitCode=0
Apr 22 18:55:02.155105 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:55:02.154703 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j" event={"ID":"9a140885-cd59-4ab9-9ef2-5a32e539c350","Type":"ContainerDied","Data":"bac812881ef54316d3f34e01a16a937e07c30fb26026bb0688b401fcb988a17b"}
Apr 22 18:55:03.160937 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:55:03.160899 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j" event={"ID":"9a140885-cd59-4ab9-9ef2-5a32e539c350","Type":"ContainerStarted","Data":"3a788560560a6b5dc715faa5aae7b8d71bf45f8f3824b2c0376c3187ef39ba11"}
Apr 22 18:55:03.161410 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:55:03.161142 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:55:03.187022 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:55:03.186969 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j" podStartSLOduration=8.852610644 podStartE2EDuration="9.186953374s" podCreationTimestamp="2026-04-22 18:54:54 +0000 UTC" firstStartedPulling="2026-04-22 18:55:02.155604286 +0000 UTC m=+686.292948681" lastFinishedPulling="2026-04-22 18:55:02.489947016 +0000 UTC m=+686.627291411" observedRunningTime="2026-04-22 18:55:03.184435385 +0000 UTC m=+687.321779799" watchObservedRunningTime="2026-04-22 18:55:03.186953374 +0000 UTC m=+687.324297787"
Apr 22 18:55:06.141696 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:55:06.141663 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl"
Apr 22 18:55:14.178393 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:55:14.178357 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j"
Apr 22 18:57:04.716142 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:04.716057 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t_67630851-6043-4d26-9507-020c13e93bc6/manager/0.log"
Apr 22 18:57:05.780047 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:05.780016 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf_8ecfaa96-6ca2-4448-8418-f8c06647d59a/extract/0.log"
Apr 22 18:57:05.786418 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:05.786389 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf_8ecfaa96-6ca2-4448-8418-f8c06647d59a/util/0.log"
Apr 22 18:57:05.793558 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:05.793528 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf_8ecfaa96-6ca2-4448-8418-f8c06647d59a/pull/0.log"
Apr 22 18:57:05.903866 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:05.903834 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v_218a6a25-5455-467f-8b36-2e3a9004d2f9/util/0.log"
Apr 22 18:57:05.909944 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:05.909906 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v_218a6a25-5455-467f-8b36-2e3a9004d2f9/pull/0.log"
Apr 22 18:57:05.915925 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:05.915902 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v_218a6a25-5455-467f-8b36-2e3a9004d2f9/extract/0.log"
Apr 22 18:57:06.025022 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.024996 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz_32c07195-42f6-418b-bcdd-48517730c21b/util/0.log"
Apr 22 18:57:06.031536 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.031445 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz_32c07195-42f6-418b-bcdd-48517730c21b/pull/0.log"
Apr 22 18:57:06.042030 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.041996 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz_32c07195-42f6-418b-bcdd-48517730c21b/extract/0.log"
Apr 22 18:57:06.145621 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.145593 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd_a6693871-180e-4097-983c-c6bb8688304b/util/0.log"
Apr 22 18:57:06.151795 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.151770 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd_a6693871-180e-4097-983c-c6bb8688304b/pull/0.log"
Apr 22 18:57:06.157664 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.157639 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd_a6693871-180e-4097-983c-c6bb8688304b/extract/0.log"
Apr 22 18:57:06.491992 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.491962 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-xklhn_d0181e7b-79bc-480d-9390-de810fdd12d5/manager/0.log"
Apr 22 18:57:06.599788 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.599759 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-g8w4q_259d75e6-ca74-4560-aa60-697f8edb7fe1/kuadrant-console-plugin/0.log"
Apr 22 18:57:06.714768 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.714737 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nx5cw_528d1f89-18bc-4e58-a2e9-6ccc58896e18/registry-server/0.log"
Apr 22 18:57:06.834874 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:06.834820 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-n6k6g_f4100129-def4-4331-a791-8c4509088fc9/manager/0.log"
Apr 22 18:57:07.056207 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:07.056175 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-pd27j_8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1/manager/0.log"
Apr 22 18:57:07.391256 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:07.391224 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8_e111bc77-1baa-4309-845c-d1ee770f7b9d/istio-proxy/0.log"
Apr 22 18:57:08.264097 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:08.264062 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl_51e85ccb-1a1e-4a5a-92ef-2b98dde465ce/storage-initializer/0.log"
Apr 22 18:57:08.271143 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:08.271116 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-2zwtl_51e85ccb-1a1e-4a5a-92ef-2b98dde465ce/main/0.log"
Apr 22 18:57:08.498920 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:08.498887 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-wnz4c_330905a2-82a2-4cfc-8cb4-35b50c022ca3/storage-initializer/0.log"
Apr 22 18:57:08.507022 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:08.506998 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-wnz4c_330905a2-82a2-4cfc-8cb4-35b50c022ca3/main/0.log"
Apr 22 18:57:08.726675 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:08.726643 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j_9a140885-cd59-4ab9-9ef2-5a32e539c350/storage-initializer/0.log"
Apr 22 18:57:08.737690 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:08.737667 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-78g6j_9a140885-cd59-4ab9-9ef2-5a32e539c350/main/0.log"
Apr 22 18:57:15.886732 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:15.886679 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-z5sdb_a8ed4379-675b-4bd5-92be-4ce4a4aa88a7/global-pull-secret-syncer/0.log"
Apr 22 18:57:15.933806 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:15.933774 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7c2jq_baf1611e-6531-40fa-9e20-fab0967f4575/konnectivity-agent/0.log"
Apr 22 18:57:16.006726 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:16.006675 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-198.ec2.internal_d715d14c3e64c4935a99ec8a0e8eebfa/haproxy/0.log"
Apr 22 18:57:19.929315 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:19.929275 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf_8ecfaa96-6ca2-4448-8418-f8c06647d59a/extract/0.log"
Apr 22 18:57:19.953198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:19.953167 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf_8ecfaa96-6ca2-4448-8418-f8c06647d59a/util/0.log"
Apr 22 18:57:19.979144 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:19.979114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759g9rvf_8ecfaa96-6ca2-4448-8418-f8c06647d59a/pull/0.log"
Apr 22 18:57:20.009068 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.009040 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v_218a6a25-5455-467f-8b36-2e3a9004d2f9/extract/0.log"
Apr 22 18:57:20.033240 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.033207 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v_218a6a25-5455-467f-8b36-2e3a9004d2f9/util/0.log"
Apr 22 18:57:20.060974 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.060936 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crd8v_218a6a25-5455-467f-8b36-2e3a9004d2f9/pull/0.log"
Apr 22 18:57:20.091654 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.091615 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz_32c07195-42f6-418b-bcdd-48517730c21b/extract/0.log"
Apr 22 18:57:20.114796 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.114765 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz_32c07195-42f6-418b-bcdd-48517730c21b/util/0.log"
Apr 22 18:57:20.138484 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.138458 2569 log.go:25] "Finished parsing log file"
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73pbfnz_32c07195-42f6-418b-bcdd-48517730c21b/pull/0.log" Apr 22 18:57:20.171850 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.171824 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd_a6693871-180e-4097-983c-c6bb8688304b/extract/0.log" Apr 22 18:57:20.193753 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.193651 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd_a6693871-180e-4097-983c-c6bb8688304b/util/0.log" Apr 22 18:57:20.215230 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.215203 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kgskd_a6693871-180e-4097-983c-c6bb8688304b/pull/0.log" Apr 22 18:57:20.301895 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.301863 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-xklhn_d0181e7b-79bc-480d-9390-de810fdd12d5/manager/0.log" Apr 22 18:57:20.328725 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.328681 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-g8w4q_259d75e6-ca74-4560-aa60-697f8edb7fe1/kuadrant-console-plugin/0.log" Apr 22 18:57:20.360530 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.360496 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nx5cw_528d1f89-18bc-4e58-a2e9-6ccc58896e18/registry-server/0.log" Apr 22 18:57:20.412383 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.412354 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-n6k6g_f4100129-def4-4331-a791-8c4509088fc9/manager/0.log" Apr 22 18:57:20.475669 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:20.475587 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-pd27j_8a3194f7-48f9-45ef-bdb9-fbdb6bfb3ae1/manager/0.log" Apr 22 18:57:22.057454 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.057428 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-hkd4b_cba8d754-67a7-482c-b98c-dcd5f2072415/cluster-monitoring-operator/0.log" Apr 22 18:57:22.086442 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.086412 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cr2rh_f0ea930f-4df9-4710-8bf2-32e6e3d0bf39/kube-state-metrics/0.log" Apr 22 18:57:22.113731 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.113675 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cr2rh_f0ea930f-4df9-4710-8bf2-32e6e3d0bf39/kube-rbac-proxy-main/0.log" Apr 22 18:57:22.138297 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.138269 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-cr2rh_f0ea930f-4df9-4710-8bf2-32e6e3d0bf39/kube-rbac-proxy-self/0.log" Apr 22 18:57:22.221193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.221091 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9stw7_6d348323-659c-4564-a8aa-bc974d5ade37/node-exporter/0.log" Apr 22 18:57:22.247634 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.247594 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9stw7_6d348323-659c-4564-a8aa-bc974d5ade37/kube-rbac-proxy/0.log" Apr 22 
18:57:22.268682 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.268652 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9stw7_6d348323-659c-4564-a8aa-bc974d5ade37/init-textfile/0.log" Apr 22 18:57:22.459036 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.459000 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-79pxt_6494b798-d8d1-443d-b970-89c2f4b791cc/kube-rbac-proxy-main/0.log" Apr 22 18:57:22.479777 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.479677 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-79pxt_6494b798-d8d1-443d-b970-89c2f4b791cc/kube-rbac-proxy-self/0.log" Apr 22 18:57:22.507434 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.507402 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-79pxt_6494b798-d8d1-443d-b970-89c2f4b791cc/openshift-state-metrics/0.log" Apr 22 18:57:22.704198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.704165 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kdk7b_80a43513-33b5-4856-9cd9-8773d592e73a/prometheus-operator/0.log" Apr 22 18:57:22.722131 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.722100 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kdk7b_80a43513-33b5-4856-9cd9-8773d592e73a/kube-rbac-proxy/0.log" Apr 22 18:57:22.746573 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:22.746488 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-bppcn_52424655-8dfa-4bcb-8b83-ccbf0a6ff7c9/prometheus-operator-admission-webhook/0.log" Apr 22 18:57:24.555532 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.555505 2569 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/2.log" Apr 22 18:57:24.556979 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.556942 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz"] Apr 22 18:57:24.561193 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.561171 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.562452 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.562428 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-whxx8_f6ed56c5-616f-4090-b07c-6ea41e001ffb/console-operator/3.log" Apr 22 18:57:24.563827 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.563807 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nb747\"/\"kube-root-ca.crt\"" Apr 22 18:57:24.565391 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.565367 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nb747\"/\"openshift-service-ca.crt\"" Apr 22 18:57:24.565529 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.565370 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nb747\"/\"default-dockercfg-hb8wb\"" Apr 22 18:57:24.568139 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.568119 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz"] Apr 22 18:57:24.609548 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.609517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-lib-modules\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.609776 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.609567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-proc\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.609776 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.609664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-podres\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.609776 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.609742 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76xlj\" (UniqueName: \"kubernetes.io/projected/18a456ba-bc22-4bc7-8ea3-83864af3297d-kube-api-access-76xlj\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.609926 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.609776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-sys\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " 
pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710507 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-lib-modules\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-proc\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710572 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-podres\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710600 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76xlj\" (UniqueName: \"kubernetes.io/projected/18a456ba-bc22-4bc7-8ea3-83864af3297d-kube-api-access-76xlj\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710629 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-sys\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-lib-modules\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-proc\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.710762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710705 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-podres\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.711045 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.710768 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/18a456ba-bc22-4bc7-8ea3-83864af3297d-sys\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.734742 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.734682 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-76xlj\" (UniqueName: \"kubernetes.io/projected/18a456ba-bc22-4bc7-8ea3-83864af3297d-kube-api-access-76xlj\") pod \"perf-node-gather-daemonset-6s7tz\" (UID: \"18a456ba-bc22-4bc7-8ea3-83864af3297d\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:24.872967 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:24.872933 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:25.007528 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.007501 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz"] Apr 22 18:57:25.008776 ip-10-0-132-198 kubenswrapper[2569]: W0422 18:57:25.008745 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod18a456ba_bc22_4bc7_8ea3_83864af3297d.slice/crio-1c05d73e06048697cb6c6e0fa94856f04b5c7c93582a76b25ea006397b36dfaf WatchSource:0}: Error finding container 1c05d73e06048697cb6c6e0fa94856f04b5c7c93582a76b25ea006397b36dfaf: Status 404 returned error can't find the container with id 1c05d73e06048697cb6c6e0fa94856f04b5c7c93582a76b25ea006397b36dfaf Apr 22 18:57:25.173762 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.173733 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68bdc9c74-l88mg_6d16941a-798d-4fcf-8cd5-9fac915ba9c9/console/0.log" Apr 22 18:57:25.219079 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.219048 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-4mw78_37e5a887-0921-4401-b900-dc37b5b8a892/download-server/0.log" Apr 22 18:57:25.725489 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.725457 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zgbjn_ce444571-7bf0-4e63-9242-ba9ff25b7c8b/volume-data-source-validator/0.log" Apr 22 18:57:25.759498 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.759463 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" event={"ID":"18a456ba-bc22-4bc7-8ea3-83864af3297d","Type":"ContainerStarted","Data":"7d1c56fc68e3885ea8e966c375cb416c858d98ca06505637dfbf4ff78ae35ff4"} Apr 22 18:57:25.759498 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.759502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" event={"ID":"18a456ba-bc22-4bc7-8ea3-83864af3297d","Type":"ContainerStarted","Data":"1c05d73e06048697cb6c6e0fa94856f04b5c7c93582a76b25ea006397b36dfaf"} Apr 22 18:57:25.759741 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.759556 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:25.778409 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:25.778353 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" podStartSLOduration=1.778339611 podStartE2EDuration="1.778339611s" podCreationTimestamp="2026-04-22 18:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:25.774544141 +0000 UTC m=+829.911888551" watchObservedRunningTime="2026-04-22 18:57:25.778339611 +0000 UTC m=+829.915684023" Apr 22 18:57:26.578178 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:26.578151 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mcc6f_f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf/dns/0.log" Apr 22 18:57:26.598641 ip-10-0-132-198 
kubenswrapper[2569]: I0422 18:57:26.598613 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mcc6f_f0952b9b-e2e6-4d5b-872a-d6b8ba86bbdf/kube-rbac-proxy/0.log" Apr 22 18:57:26.664198 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:26.664172 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2dk4j_f3cf8a0e-7d54-44c9-952a-7acd6af68bc4/dns-node-resolver/0.log" Apr 22 18:57:27.184870 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:27.184833 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-c58665cd8-ln2nc_37da8efb-41bd-482a-858a-29a00dd98073/registry/0.log" Apr 22 18:57:27.250313 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:27.250285 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vckqs_1d694bc1-10aa-4951-95ba-8ec4878fea46/node-ca/0.log" Apr 22 18:57:28.067263 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:28.067231 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cffm9m8_e111bc77-1baa-4309-845c-d1ee770f7b9d/istio-proxy/0.log" Apr 22 18:57:28.702731 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:28.702662 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8xrxz_c49b1678-0299-4b9e-a911-9a8cf6fedf32/serve-healthcheck-canary/0.log" Apr 22 18:57:29.352637 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:29.352585 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nr9dg_0970c196-c954-48e1-a580-5cdba184f826/kube-rbac-proxy/0.log" Apr 22 18:57:29.372696 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:29.372643 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nr9dg_0970c196-c954-48e1-a580-5cdba184f826/exporter/0.log" Apr 22 
18:57:29.396188 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:29.396161 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nr9dg_0970c196-c954-48e1-a580-5cdba184f826/extractor/0.log" Apr 22 18:57:31.365337 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:31.365279 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7c59bb5d7b-g2k6t_67630851-6043-4d26-9507-020c13e93bc6/manager/0.log" Apr 22 18:57:31.773608 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:31.773531 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-6s7tz" Apr 22 18:57:32.488864 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:32.488836 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59544c8f7-bbmw5_bb3fe25a-e9a2-48dd-b291-777a670e942d/manager/0.log" Apr 22 18:57:37.489948 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:37.489916 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6fj2l_da28986a-53c5-4e28-9290-9ebf6cffc2ae/kube-storage-version-migrator-operator/1.log" Apr 22 18:57:37.491490 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:37.491463 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6fj2l_da28986a-53c5-4e28-9290-9ebf6cffc2ae/kube-storage-version-migrator-operator/0.log" Apr 22 18:57:38.477095 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:38.477058 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27fs7_100cdeaa-300c-4323-8dd0-f6d9ca2702af/kube-multus-additional-cni-plugins/0.log" Apr 22 18:57:38.500623 ip-10-0-132-198 kubenswrapper[2569]: 
I0422 18:57:38.500597 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27fs7_100cdeaa-300c-4323-8dd0-f6d9ca2702af/egress-router-binary-copy/0.log" Apr 22 18:57:38.521433 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:38.521409 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27fs7_100cdeaa-300c-4323-8dd0-f6d9ca2702af/cni-plugins/0.log" Apr 22 18:57:38.544301 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:38.544273 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27fs7_100cdeaa-300c-4323-8dd0-f6d9ca2702af/bond-cni-plugin/0.log" Apr 22 18:57:38.567144 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:38.567114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27fs7_100cdeaa-300c-4323-8dd0-f6d9ca2702af/routeoverride-cni/0.log" Apr 22 18:57:38.587992 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:38.587964 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27fs7_100cdeaa-300c-4323-8dd0-f6d9ca2702af/whereabouts-cni-bincopy/0.log" Apr 22 18:57:38.609665 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:38.609637 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-27fs7_100cdeaa-300c-4323-8dd0-f6d9ca2702af/whereabouts-cni/0.log" Apr 22 18:57:39.021693 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:39.021660 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w4nw5_8cd178ba-71e9-476e-b2e8-e4a4f6ad3f97/kube-multus/0.log" Apr 22 18:57:39.092809 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:39.092781 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-srr7v_0d277017-e970-494a-84b8-a3c2be9bbf85/network-metrics-daemon/0.log" Apr 22 18:57:39.115906 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:39.115878 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-srr7v_0d277017-e970-494a-84b8-a3c2be9bbf85/kube-rbac-proxy/0.log" Apr 22 18:57:40.468446 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.468418 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-controller/0.log" Apr 22 18:57:40.485415 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.485386 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/0.log" Apr 22 18:57:40.492977 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.492937 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovn-acl-logging/1.log" Apr 22 18:57:40.515037 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.515012 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/kube-rbac-proxy-node/0.log" Apr 22 18:57:40.536527 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.536482 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 18:57:40.554678 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.554650 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/northd/0.log" Apr 22 18:57:40.583705 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.583667 2569 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/nbdb/0.log" Apr 22 18:57:40.606398 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.606360 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/sbdb/0.log" Apr 22 18:57:40.773241 ip-10-0-132-198 kubenswrapper[2569]: I0422 18:57:40.773146 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zlsv5_1a4a7ddf-880f-4094-a270-f8e491751768/ovnkube-controller/0.log"