Apr 19 12:07:13.484942 ip-10-0-140-225 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 19 12:07:13.484953 ip-10-0-140-225 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 19 12:07:13.484960 ip-10-0-140-225 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 19 12:07:13.485202 ip-10-0-140-225 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 19 12:07:23.597913 ip-10-0-140-225 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 19 12:07:23.597930 ip-10-0-140-225 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8207ff8676954a03a5ed7ddaabe72ef5 --
Apr 19 12:09:41.154036 ip-10-0-140-225 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 12:09:41.607710 ip-10-0-140-225 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:41.607710 ip-10-0-140-225 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 12:09:41.607710 ip-10-0-140-225 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:41.607710 ip-10-0-140-225 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 19 12:09:41.607710 ip-10-0-140-225 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:41.610573 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.610487 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 12:09:41.615290 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615266 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:41.615290 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615287 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:41.615290 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615292 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:41.615290 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615296 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615299 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615303 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615306 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615310 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615313 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615316 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615319 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615322 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615324 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615330 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615333 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615336 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615338 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615341 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615344 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615347 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615349 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615352 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:41.615452 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615354 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615357 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615360 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615365 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615368 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615372 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615374 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615377 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615380 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615382 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615385 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615387 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615390 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615393 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615395 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615398 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615403 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615406 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615409 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615412 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:41.615955 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615415 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615418 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615420 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615423 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615425 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615428 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615431 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615433 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615438 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615441 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615444 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615446 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615449 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615451 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615457 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615461 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615466 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615469 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615472 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:41.616445 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615477 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615480 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615483 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615485 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615488 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615491 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615493 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615496 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615499 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615502 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615504 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615507 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615510 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615515 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615518 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615520 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615523 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615529 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615532 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615534 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:41.616937 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615537 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615539 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615542 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615545 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.615547 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616158 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616164 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616167 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616169 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616172 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616178 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616181 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616183 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616186 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616189 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616192 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616194 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616197 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616200 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616203 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:41.617409 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616206 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616209 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616214 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616217 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616220 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616223 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616226 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616228 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616231 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616234 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616236 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616239 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616242 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616244 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616247 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616252 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616255 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616257 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616260 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616263 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:41.617911 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616265 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616270 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616274 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616277 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616280 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616282 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616288 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616290 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616293 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616296 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616298 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616302 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616304 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616307 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616310 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616312 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616315 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616318 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616321 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:41.618438 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616326 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616328 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616331 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616334 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616336 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616339 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616341 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616344 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616346 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616351 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616354 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616357 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616363 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616366 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616369 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616371 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616374 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616377 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616379 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616382 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:41.618925 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616385 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616387 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616390 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616392 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616399 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616401 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616404 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616407 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616410 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616412 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616415 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.616418 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.617986 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618004 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618012 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618017 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618021 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618025 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618030 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618034 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618038 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 12:09:41.619407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618041 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618045 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618048 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618051 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618054 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618057 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618060 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618063 2568 flags.go:64] FLAG: --cloud-config=""
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618066 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618069 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618074 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618076 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618080 2568 flags.go:64] FLAG: --config-dir=""
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618082 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618086 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618090 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618093 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618096 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618099 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618102 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618105 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618108 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618111 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618115 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618121 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 12:09:41.619940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618124 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618127 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618130 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618134 2568 flags.go:64] FLAG: --enable-server="true"
Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618137 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618142 2568 flags.go:64] FLAG: --event-burst="100"
Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618145 2568 flags.go:64] FLAG: --event-qps="50"
Apr 19
12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618148 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618151 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618154 2568 flags.go:64] FLAG: --eviction-hard="" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618158 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618161 2568 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618164 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618166 2568 flags.go:64] FLAG: --eviction-soft="" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618169 2568 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618172 2568 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618175 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618178 2568 flags.go:64] FLAG: --experimental-mounter-path="" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618181 2568 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618184 2568 flags.go:64] FLAG: --fail-swap-on="true" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618187 2568 flags.go:64] FLAG: --feature-gates="" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618191 
2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618194 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618197 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618200 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618204 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 19 12:09:41.620547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618207 2568 flags.go:64] FLAG: --help="false" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618210 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-140-225.ec2.internal" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618213 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618216 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618219 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618224 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618228 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618231 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618234 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 12:09:41.621188 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618238 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618241 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618244 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618247 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618250 2568 flags.go:64] FLAG: --kube-reserved="" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618253 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618255 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618259 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618261 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618265 2568 flags.go:64] FLAG: --lock-file="" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618267 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618270 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618274 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618279 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 12:09:41.621188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618282 2568 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618285 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618289 2568 flags.go:64] FLAG: --logging-format="text" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618292 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618295 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618298 2568 flags.go:64] FLAG: --manifest-url="" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618301 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618306 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618309 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618313 2568 flags.go:64] FLAG: --max-pods="110" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618316 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618319 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618323 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618326 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618330 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 12:09:41.621759 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618333 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618336 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618344 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618347 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618350 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618355 2568 flags.go:64] FLAG: --pod-cidr="" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618358 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618363 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618366 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 12:09:41.621759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618369 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618372 2568 flags.go:64] FLAG: --port="10250" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618375 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618377 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-088d9a497b411fa67" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618381 2568 flags.go:64] FLAG: --qos-reserved="" Apr 
19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618384 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618387 2568 flags.go:64] FLAG: --register-node="true" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618390 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618393 2568 flags.go:64] FLAG: --register-with-taints="" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618396 2568 flags.go:64] FLAG: --registry-burst="10" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618399 2568 flags.go:64] FLAG: --registry-qps="5" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618402 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618405 2568 flags.go:64] FLAG: --reserved-memory="" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618408 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618412 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618415 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618418 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618421 2568 flags.go:64] FLAG: --runonce="false" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618424 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618427 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618430 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618433 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618436 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618439 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618443 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618446 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 19 12:09:41.622339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618449 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618452 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618455 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618458 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618461 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618464 2568 flags.go:64] FLAG: --system-cgroups="" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618467 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618482 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 19 12:09:41.622972 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:09:41.618485 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618488 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618492 2568 flags.go:64] FLAG: --tls-min-version="" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618495 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618498 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618501 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618503 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618506 2568 flags.go:64] FLAG: --v="2" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618510 2568 flags.go:64] FLAG: --version="false" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618514 2568 flags.go:64] FLAG: --vmodule="" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618526 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.618529 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618639 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618643 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618647 2568 feature_gate.go:328] 
unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618650 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:09:41.622972 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618653 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618655 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618658 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618661 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618664 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618667 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618669 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618672 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618675 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618677 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618680 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 
12:09:41.618683 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618686 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618689 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618691 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618694 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618696 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618699 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618701 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618704 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:09:41.623587 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618706 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618708 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618711 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618713 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 
12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618716 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618718 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618721 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618723 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618726 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618729 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618731 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618734 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618737 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618740 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618744 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618747 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618750 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618753 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618756 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:09:41.624106 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618759 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618762 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618764 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618767 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618769 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618772 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618774 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618777 2568 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618780 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618782 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618785 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618789 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618792 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618795 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618798 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618801 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618804 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618807 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618810 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:09:41.624584 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618813 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:09:41.624584 
ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618815 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618818 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618820 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618823 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618825 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618827 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618830 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618833 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618835 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618838 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618840 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618843 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618845 2568 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAzure Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618848 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618850 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618853 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618855 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618858 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618860 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:09:41.625130 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618863 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:09:41.625687 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618868 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:09:41.625687 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618871 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:09:41.625687 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.618873 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:09:41.625687 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.619695 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false 
ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 12:09:41.626012 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.625901 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 19 12:09:41.626044 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.626013 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 19 12:09:41.626074 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626057 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:09:41.626074 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626064 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 19 12:09:41.626074 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626068 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:09:41.626074 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626071 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:09:41.626074 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626075 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626078 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626081 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626083 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626086 2568 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626089 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626092 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626095 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626097 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626100 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626103 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626106 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626108 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626111 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626114 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626117 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626120 2568 feature_gate.go:328] unrecognized feature gate: 
NewOLM Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626122 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626125 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:09:41.626208 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626128 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626130 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626133 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626135 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626138 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626140 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626143 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626146 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626150 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626152 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 
12:09:41.626155 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626158 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626160 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626163 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626166 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626169 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626172 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626175 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626178 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626180 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:09:41.626773 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626183 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626185 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626188 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:09:41.627265 
ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626191 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626193 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626196 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626198 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626201 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626204 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626206 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626209 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626211 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626214 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626217 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626219 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626222 2568 feature_gate.go:328] unrecognized feature 
gate: GatewayAPI Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626224 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626227 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626230 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626232 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:09:41.627265 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626235 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626241 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626243 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626246 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626249 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626251 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626254 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626256 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626259 2568 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626262 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626264 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626267 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626269 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626272 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626275 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626279 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626282 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626285 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626288 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626290 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:09:41.627813 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626293 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626295 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626298 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.626303 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626393 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626397 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 
12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626401 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626403 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626406 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626409 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626412 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626415 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626419 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626423 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626427 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 19 12:09:41.628287 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626430 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626433 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626435 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626437 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626440 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626445 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626448 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626452 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626454 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626457 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626460 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626462 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626465 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626467 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626470 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626481 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626484 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626487 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626489 2568 feature_gate.go:328] unrecognized feature 
gate: NutanixMultiSubnets Apr 19 12:09:41.628682 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626491 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626494 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626496 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626499 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626502 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626504 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626507 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626509 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626512 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626514 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626517 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626519 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626522 2568 feature_gate.go:328] unrecognized 
feature gate: NetworkDiagnosticsConfig Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626525 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626528 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626530 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626533 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626535 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626538 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626540 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:09:41.629137 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626543 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626545 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626548 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626551 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626553 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 
12:09:41.626556 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626558 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626561 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626563 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626566 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626568 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626571 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626573 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626576 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626578 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626581 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626583 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626585 2568 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626588 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626590 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:09:41.629691 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626593 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626595 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626598 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626600 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626603 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626605 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626608 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626611 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626613 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626616 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:09:41.630188 ip-10-0-140-225 
kubenswrapper[2568]: W0419 12:09:41.626618 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626621 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626637 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626640 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626642 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:09:41.630188 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:41.626645 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 12:09:41.630563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.626650 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 19 12:09:41.630563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.626749 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 19 12:09:41.630563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.629063 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 19 12:09:41.630563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.629973 2568 server.go:1019] "Starting client certificate rotation" 
Apr 19 12:09:41.630563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.630071 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 19 12:09:41.630563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.630104 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 19 12:09:41.653696 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.653677 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 19 12:09:41.656266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.656247 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 19 12:09:41.675982 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.675958 2568 log.go:25] "Validated CRI v1 runtime API" Apr 19 12:09:41.681916 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.681897 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 19 12:09:41.682431 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.682403 2568 log.go:25] "Validated CRI v1 image API" Apr 19 12:09:41.684330 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.684314 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 19 12:09:41.689881 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.689861 2568 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a1eb09a5-f129-4fce-85d1-0226ad677d72:/dev/nvme0n1p3 c2782ff9-1ea6-4cc4-b641-bab06dc26641:/dev/nvme0n1p4] Apr 19 12:09:41.689952 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.689881 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} 
/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 19 12:09:41.694909 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.694788 2568 manager.go:217] Machine: {Timestamp:2026-04-19 12:09:41.693510868 +0000 UTC m=+0.412335015 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3117259 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec274b7631819dcdee7d8fe08e83b0bc SystemUUID:ec274b76-3181-9dcd-ee7d-8fe08e83b0bc BootID:8207ff86-7695-4a03-a5ed-7ddaabe72ef5 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7a:a0:91:84:89 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7a:a0:91:84:89 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:76:5c:71:48:d6:b0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} 
{PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 19 12:09:41.694909 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.694901 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 19 12:09:41.695029 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.695017 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 19 12:09:41.695949 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.695925 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 19 12:09:41.696079 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.695951 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-225.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 19 12:09:41.696126 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.696091 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 19 12:09:41.696126 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.696100 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 19 12:09:41.696126 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.696113 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 19 12:09:41.697068 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.697058 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 19 12:09:41.698479 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.698462 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 19 12:09:41.698590 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.698581 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 19 12:09:41.701496 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.701484 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 19 12:09:41.701535 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.701499 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 19 12:09:41.701535 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.701512 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 19 12:09:41.701535 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.701522 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 19 12:09:41.701535 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.701530 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 19 12:09:41.702757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.702742 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 19 12:09:41.702809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.702769 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 19 12:09:41.705940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.705921 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 19 12:09:41.707378 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.707361 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 19 12:09:41.709367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709352 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709373 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709382 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709390 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709399 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709408 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709417 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709425 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709435 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 19 12:09:41.709451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709445 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 19 12:09:41.709739 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709458 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 19 12:09:41.709739 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.709472 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 19 12:09:41.710216 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.710205 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 19 12:09:41.710268 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.710220 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 19 12:09:41.713868 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.713853 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 19 12:09:41.713958 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.713895 2568 server.go:1295] "Started kubelet"
Apr 19 12:09:41.714010 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.713966 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 19 12:09:41.714124 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.714069 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 19 12:09:41.714170 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.714153 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 19 12:09:41.714849 ip-10-0-140-225 systemd[1]: Started Kubernetes Kubelet.
Apr 19 12:09:41.715292 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.715229 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 19 12:09:41.720990 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.720961 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-225.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 19 12:09:41.721204 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.721185 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 19 12:09:41.721301 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.721242 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-225.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 19 12:09:41.721854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.721836 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 19 12:09:41.722836 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.722811 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5pw4l"
Apr 19 12:09:41.725166 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.725146 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 19 12:09:41.725926 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.724974 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-225.ec2.internal.18a7c0d0f4ceac21 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-225.ec2.internal,UID:ip-10-0-140-225.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-225.ec2.internal,},FirstTimestamp:2026-04-19 12:09:41.713865761 +0000 UTC m=+0.432689908,LastTimestamp:2026-04-19 12:09:41.713865761 +0000 UTC m=+0.432689908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-225.ec2.internal,}"
Apr 19 12:09:41.726008 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.725969 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 19 12:09:41.726223 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.726207 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 19 12:09:41.727040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.726965 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 19 12:09:41.727040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.726970 2568 factory.go:153] Registering CRI-O factory
Apr 19 12:09:41.727040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.726965 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 19 12:09:41.727040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.726992 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 19 12:09:41.727040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.726993 2568 factory.go:223] Registration of the crio container factory successfully
Apr 19 12:09:41.727266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.727057 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 19 12:09:41.727266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.727066 2568 factory.go:55] Registering systemd factory
Apr 19 12:09:41.727266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.727074 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 19 12:09:41.727266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.727094 2568 factory.go:103] Registering Raw factory
Apr 19 12:09:41.727266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.727105 2568 manager.go:1196] Started watching for new ooms in manager
Apr 19 12:09:41.727266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.727105 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 19 12:09:41.727266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.727161 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 19 12:09:41.727738 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.727405 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:41.728668 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.728653 2568 manager.go:319] Starting recovery of all containers
Apr 19 12:09:41.729591 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.729570 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5pw4l"
Apr 19 12:09:41.729983 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.729956 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 19 12:09:41.730069 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.730050 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-225.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 19 12:09:41.739490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.739476 2568 manager.go:324] Recovery completed
Apr 19 12:09:41.743518 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.743506 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:09:41.747657 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.747639 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:09:41.747731 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.747669 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:09:41.747731 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.747680 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:09:41.748154 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.748138 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 19 12:09:41.748154 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.748151 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 19 12:09:41.748247 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.748166 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 19 12:09:41.750409 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.750397 2568 policy_none.go:49] "None policy: Start"
Apr 19 12:09:41.750455 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.750413 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 19 12:09:41.750455 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.750423 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.790235 2568 manager.go:341] "Starting Device Plugin manager"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.790295 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.790305 2568 server.go:85] "Starting device plugin registration server"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.790619 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.790646 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.790820 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.790888 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.790894 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.791685 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 19 12:09:41.795434 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.791717 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:41.862396 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.862311 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 19 12:09:41.863575 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.863559 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 19 12:09:41.863664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.863592 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 19 12:09:41.863664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.863616 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 19 12:09:41.863664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.863640 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 19 12:09:41.863793 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.863686 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 19 12:09:41.866863 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.866840 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:41.891685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.891655 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:09:41.892573 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.892556 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:09:41.892667 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.892583 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:09:41.892667 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.892593 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:09:41.892667 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.892616 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-225.ec2.internal"
Apr 19 12:09:41.901319 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.901305 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-225.ec2.internal"
Apr 19 12:09:41.901379 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.901325 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-225.ec2.internal\": node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:41.911143 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.911122 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:41.964761 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.964739 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"]
Apr 19 12:09:41.964847 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.964815 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:09:41.966835 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.966816 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:09:41.966897 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.966850 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:09:41.966897 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.966859 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:09:41.968140 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968129 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:09:41.968265 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968252 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:41.968319 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968284 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:09:41.968799 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968780 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:09:41.968799 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968794 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:09:41.968915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968807 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:09:41.968915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968817 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:09:41.968915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968830 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:09:41.968915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.968820 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:09:41.969940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.969925 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:41.969990 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.969959 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 19 12:09:41.970609 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.970592 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientMemory"
Apr 19 12:09:41.970674 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.970642 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasNoDiskPressure"
Apr 19 12:09:41.970674 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:41.970659 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeHasSufficientPID"
Apr 19 12:09:41.996329 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:41.996308 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-225.ec2.internal\" not found" node="ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.000788 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.000775 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-225.ec2.internal\" not found" node="ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.011517 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.011502 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:42.029396 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.029373 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e49faeb3a3d52b4f2890ad643fe20f27-config\") pod \"kube-apiserver-proxy-ip-10-0-140-225.ec2.internal\" (UID: \"e49faeb3a3d52b4f2890ad643fe20f27\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.029483 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.029401 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5b9537c8e78a3a60f1b8469cd72a5209-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal\" (UID: \"5b9537c8e78a3a60f1b8469cd72a5209\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.029483 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.029417 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b9537c8e78a3a60f1b8469cd72a5209-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal\" (UID: \"5b9537c8e78a3a60f1b8469cd72a5209\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.111771 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.111739 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:42.130202 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.130148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e49faeb3a3d52b4f2890ad643fe20f27-config\") pod \"kube-apiserver-proxy-ip-10-0-140-225.ec2.internal\" (UID: \"e49faeb3a3d52b4f2890ad643fe20f27\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.130202 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.130178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5b9537c8e78a3a60f1b8469cd72a5209-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal\" (UID: \"5b9537c8e78a3a60f1b8469cd72a5209\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.130325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.130209 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b9537c8e78a3a60f1b8469cd72a5209-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal\" (UID: \"5b9537c8e78a3a60f1b8469cd72a5209\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.130325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.130234 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b9537c8e78a3a60f1b8469cd72a5209-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal\" (UID: \"5b9537c8e78a3a60f1b8469cd72a5209\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.130325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.130238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e49faeb3a3d52b4f2890ad643fe20f27-config\") pod \"kube-apiserver-proxy-ip-10-0-140-225.ec2.internal\" (UID: \"e49faeb3a3d52b4f2890ad643fe20f27\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.130325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.130253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5b9537c8e78a3a60f1b8469cd72a5209-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal\" (UID: \"5b9537c8e78a3a60f1b8469cd72a5209\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.212474 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.212456 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:42.299041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.299007 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.303568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.303546 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.313232 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.313214 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:42.413715 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.413652 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:42.514186 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.514153 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-225.ec2.internal\" not found"
Apr 19 12:09:42.549915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.549892 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:42.618419 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.618389 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:42.626383 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.626360 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.629500 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.629482 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 19 12:09:42.629608 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.629589 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:09:42.629676 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.629643 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:09:42.629735 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.629669 2568 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://af517b4f1c8bf45b6846347cfdc75ea2-18ab6751faed5dfa.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.140.225:49096->100.30.163.6:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.629735 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.629691 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:42.648353 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.648333 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:09:42.702736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.702666 2568 apiserver.go:52] "Watching apiserver"
Apr 19 12:09:42.710637 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.710598 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 19 12:09:42.710960 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.710940 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h8wxf","openshift-multus/multus-additional-cni-plugins-rqgk2","openshift-image-registry/node-ca-t4c4r","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal","openshift-multus/multus-jxzs6","openshift-multus/network-metrics-daemon-7vkmz","openshift-network-diagnostics/network-check-target-xmmjm","openshift-network-operator/iptables-alerter-s2mg7","openshift-ovn-kubernetes/ovnkube-node-7t4b4","kube-system/konnectivity-agent-8gk7l","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb","openshift-cluster-node-tuning-operator/tuned-445xt"]
Apr 19 12:09:42.713506 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.713487 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.714527 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.714504 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:42.714611 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.714577 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:09:42.714692 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.714607 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/node-resolver-h8wxf" Apr 19 12:09:42.717352 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717307 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t4c4r" Apr 19 12:09:42.717448 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717307 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.717448 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717360 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 19 12:09:42.717555 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717478 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q8n5k\"" Apr 19 12:09:42.717555 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717499 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 19 12:09:42.717678 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717561 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 19 12:09:42.717764 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717744 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 19 12:09:42.717995 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717973 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 19 12:09:42.717995 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.717987 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 19 12:09:42.718163 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:09:42.718075 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cntlb\"" Apr 19 12:09:42.718619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.718598 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:42.718721 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.718680 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:09:42.719376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.719355 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 19 12:09:42.719567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.719554 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 19 12:09:42.719707 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.719690 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2mg7" Apr 19 12:09:42.720075 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.720060 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 19 12:09:42.720139 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.720093 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dnr6p\"" Apr 19 12:09:42.720139 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.720103 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 19 12:09:42.720409 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.720390 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 19 12:09:42.720497 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.720453 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-59bnb\"" Apr 19 12:09:42.720957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.720942 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.721592 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.721576 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:09:42.721692 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.721614 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 19 12:09:42.721692 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.721667 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-t5cz6\"" Apr 19 12:09:42.722005 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.721989 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 19 12:09:42.722005 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.721998 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8gk7l" Apr 19 12:09:42.722960 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.722946 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 19 12:09:42.723252 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.723233 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 19 12:09:42.723325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.723257 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 19 12:09:42.723325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.723266 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724275 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724279 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nd9dx\"" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724315 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724318 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-66msp\"" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724325 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724375 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724388 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 19 12:09:42.724491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.724313 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 19 12:09:42.725229 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.725213 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 19 12:09:42.725296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.725232 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 19 12:09:42.725296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.725290 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 19 12:09:42.725389 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.725298 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9s8vz\"" Apr 19 12:09:42.725389 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.725293 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 19 12:09:42.726356 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.726340 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:09:42.726446 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.726376 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 19 12:09:42.726446 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.726346 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qf68d\"" Apr 19 12:09:42.727653 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.727619 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 19 12:09:42.732421 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.732393 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 12:04:41 +0000 UTC" deadline="2027-09-28 11:32:47.71523612 +0000 UTC" Apr 19 12:09:42.732490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.732421 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12647h23m4.982819357s" Apr 19 12:09:42.733567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733383 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-sys\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.733567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733405 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd5k8\" (UniqueName: \"kubernetes.io/projected/b92e24a5-af38-47f4-b61e-2ec11bd04716-kube-api-access-qd5k8\") pod 
\"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.733567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-netns\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.733567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733446 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fefc3e1e-c62c-487a-a8d5-2c56a47bb505-konnectivity-ca\") pod \"konnectivity-agent-8gk7l\" (UID: \"fefc3e1e-c62c-487a-a8d5-2c56a47bb505\") " pod="kube-system/konnectivity-agent-8gk7l" Apr 19 12:09:42.733567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-systemd\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.733567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733519 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-ovn\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.733567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-node-log\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733577 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-ovnkube-script-lib\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733607 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-etc-selinux\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733653 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysconfig\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733680 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-var-lib-kubelet\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:09:42.733739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97cb\" (UniqueName: \"kubernetes.io/projected/674429c5-1701-4b79-a719-7de71b17fc9c-kube-api-access-d97cb\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz8jl\" (UniqueName: \"kubernetes.io/projected/29686a24-b6da-4655-8af2-679ab3a6bbbf-kube-api-access-lz8jl\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:42.733921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-sys-fs\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.734263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733948 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9csvp\" (UniqueName: 
\"kubernetes.io/projected/bd8d4e34-daad-49df-bacb-6940e5066abf-kube-api-access-9csvp\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.734263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733975 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-os-release\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.734263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.733997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-k8s-cni-cncf-io\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.734263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734138 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-etc-kubernetes\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.734263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-registration-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.734263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734201 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-kubernetes\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.734263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-kubelet\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734273 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/674429c5-1701-4b79-a719-7de71b17fc9c-ovn-node-metrics-cert\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9920f4d8-e6b0-4993-baa5-e254915bebae-hosts-file\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " 
pod="openshift-dns/node-resolver-h8wxf" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734339 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-device-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734390 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b92e24a5-af38-47f4-b61e-2ec11bd04716-tmp\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48nq\" (UniqueName: \"kubernetes.io/projected/9920f4d8-e6b0-4993-baa5-e254915bebae-kube-api-access-s48nq\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734432 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734555 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e45a4de-472d-4b7e-addc-01dfec69c9d8-host\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r" Apr 19 12:09:42.734588 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734591 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-slash\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.734911 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734610 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysctl-d\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.734911 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734678 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-os-release\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.734911 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734701 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzx4\" (UniqueName: \"kubernetes.io/projected/688acc9f-4a93-448c-a106-915356989bff-kube-api-access-wgzx4\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.734911 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734724 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-iptables-alerter-script\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:42.735149 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.734745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735181 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735160 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-tuned\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.735181 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735175 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-cni-binary-copy\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.735236 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-cni-bin\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.735236 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735202 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9920f4d8-e6b0-4993-baa5-e254915bebae-tmp-dir\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf"
Apr 19 12:09:42.735236 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-host-slash\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:42.735236 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735233 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-run-netns\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735252 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-cni-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735268 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-modprobe-d\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735281 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-run\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735296 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-system-cni-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735318 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msk5t\" (UniqueName: \"kubernetes.io/projected/4e45a4de-472d-4b7e-addc-01dfec69c9d8-kube-api-access-msk5t\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735332 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-kubelet\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-systemd-units\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735358 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-var-lib-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735410 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-cnibin\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735432 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-log-socket\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735502 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-env-overrides\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-system-cni-dir\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735589 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-systemd\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735636 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-cnibin\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735662 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-cni-multus\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735683 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fefc3e1e-c62c-487a-a8d5-2c56a47bb505-agent-certs\") pod \"konnectivity-agent-8gk7l\" (UID: \"fefc3e1e-c62c-487a-a8d5-2c56a47bb505\") " pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-lib-modules\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735716 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-socket-dir-parent\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.735757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-hostroot\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735822 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-conf-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxnbp\" (UniqueName: \"kubernetes.io/projected/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-kube-api-access-dxnbp\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735869 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735892 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-socket-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735922 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-multus-certs\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735946 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735969 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-cni-bin\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.735997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgkc8\" (UniqueName: \"kubernetes.io/projected/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-kube-api-access-mgkc8\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736019 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-ovnkube-config\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.736228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736063 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysctl-conf\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e45a4de-472d-4b7e-addc-01dfec69c9d8-serviceca\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-cni-netd\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-host\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736508 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-daemon-config\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-cni-binary-copy\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736603 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.736685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.736688 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-etc-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.768175 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.768151 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-t9xlv"
Apr 19 12:09:42.774096 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.774080 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t9xlv"
Apr 19 12:09:42.837225 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837171 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-systemd\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.837225 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-cnibin\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837381 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837271 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-cni-multus\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837381 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837312 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fefc3e1e-c62c-487a-a8d5-2c56a47bb505-agent-certs\") pod \"konnectivity-agent-8gk7l\" (UID: \"fefc3e1e-c62c-487a-a8d5-2c56a47bb505\") " pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:42.837381 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837317 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-systemd\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.837381 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-lib-modules\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.837381 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837369 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-cni-multus\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-cnibin\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837392 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-socket-dir-parent\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837425 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-hostroot\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837439 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-lib-modules\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837455 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-conf-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxnbp\" (UniqueName: \"kubernetes.io/projected/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-kube-api-access-dxnbp\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837514 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-hostroot\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-socket-dir-parent\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837527 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837555 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-socket-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837562 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-conf-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.837596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-multus-certs\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-kubelet-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-multus-certs\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837678 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837688 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-socket-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837725 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-cni-bin\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837730 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837756 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mgkc8\" (UniqueName: \"kubernetes.io/projected/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-kube-api-access-mgkc8\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837774 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-cni-bin\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837760 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-ovnkube-config\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysctl-conf\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4e45a4de-472d-4b7e-addc-01dfec69c9d8-serviceca\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837871 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-cni-netd\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837924 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-cni-netd\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.837978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysctl-conf\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-host\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838066 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-daemon-config\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.838165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838091 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-cni-binary-copy\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838146 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-host\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838152 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-etc-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838193 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-sys\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qd5k8\" (UniqueName: \"kubernetes.io/projected/b92e24a5-af38-47f4-b61e-2ec11bd04716-kube-api-access-qd5k8\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-sys\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-ovnkube-config\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838504 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-netns\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838389 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-etc-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-netns\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838540 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fefc3e1e-c62c-487a-a8d5-2c56a47bb505-konnectivity-ca\") pod \"konnectivity-agent-8gk7l\" (UID: \"fefc3e1e-c62c-487a-a8d5-2c56a47bb505\") " pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-systemd\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-ovn\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-node-log\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838664 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-ovnkube-script-lib\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838686 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName:
\"kubernetes.io/configmap/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-daemon-config\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838688 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-etc-selinux\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.838987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysconfig\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-var-lib-kubelet\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838779 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-systemd\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838347 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/4e45a4de-472d-4b7e-addc-01dfec69c9d8-serviceca\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-etc-selinux\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d97cb\" (UniqueName: \"kubernetes.io/projected/674429c5-1701-4b79-a719-7de71b17fc9c-kube-api-access-d97cb\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-cni-binary-copy\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838830 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838851 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz8jl\" (UniqueName: \"kubernetes.io/projected/29686a24-b6da-4655-8af2-679ab3a6bbbf-kube-api-access-lz8jl\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysconfig\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838890 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-var-lib-kubelet\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838880 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.838937 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-node-log\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.839828 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:09:42.838998 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-ovn\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839030 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-sys-fs\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-sys-fs\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9csvp\" (UniqueName: \"kubernetes.io/projected/bd8d4e34-daad-49df-bacb-6940e5066abf-kube-api-access-9csvp\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.839828 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-os-release\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 
12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-k8s-cni-cncf-io\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839212 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-etc-kubernetes\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839291 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-os-release\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-registration-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839373 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-kubernetes\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 
12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839382 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-etc-kubernetes\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839377 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fefc3e1e-c62c-487a-a8d5-2c56a47bb505-konnectivity-ca\") pod \"konnectivity-agent-8gk7l\" (UID: \"fefc3e1e-c62c-487a-a8d5-2c56a47bb505\") " pod="kube-system/konnectivity-agent-8gk7l" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839407 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-kubelet\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839434 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-run-k8s-cni-cncf-io\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839447 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:42.840736 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839450 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-kubelet\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839476 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-registration-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/674429c5-1701-4b79-a719-7de71b17fc9c-ovn-node-metrics-cert\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839510 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9920f4d8-e6b0-4993-baa5-e254915bebae-hosts-file\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-kubernetes\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 
12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839471 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-ovnkube-script-lib\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.839540 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:42.840736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839544 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-device-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b92e24a5-af38-47f4-b61e-2ec11bd04716-tmp\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839570 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9920f4d8-e6b0-4993-baa5-e254915bebae-hosts-file\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.839597 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:09:43.339576443 +0000 UTC m=+2.058400575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s48nq\" (UniqueName: \"kubernetes.io/projected/9920f4d8-e6b0-4993-baa5-e254915bebae-kube-api-access-s48nq\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.839647 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd8d4e34-daad-49df-bacb-6940e5066abf-device-dir\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840206 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e45a4de-472d-4b7e-addc-01dfec69c9d8-host\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-slash\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840372 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysctl-d\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-os-release\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840482 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-wgzx4\" (UniqueName: \"kubernetes.io/projected/688acc9f-4a93-448c-a106-915356989bff-kube-api-access-wgzx4\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840565 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-iptables-alerter-script\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840695 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.841502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.840906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fefc3e1e-c62c-487a-a8d5-2c56a47bb505-agent-certs\") pod \"konnectivity-agent-8gk7l\" (UID: \"fefc3e1e-c62c-487a-a8d5-2c56a47bb505\") " pod="kube-system/konnectivity-agent-8gk7l" Apr 19 12:09:42.842280 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.841540 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/688acc9f-4a93-448c-a106-915356989bff-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.842280 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.841549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-sysctl-d\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.842280 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.841645 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-slash\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.842280 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.841748 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-os-release\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2" Apr 19 12:09:42.842280 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.841808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4e45a4de-472d-4b7e-addc-01dfec69c9d8-host\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r" Apr 19 12:09:42.842280 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.842215 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:09:42.842549 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.842358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-tuned\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt" Apr 19 12:09:42.842549 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.842501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-iptables-alerter-script\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7" Apr 19 12:09:42.842660 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.842543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-cni-binary-copy\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6" Apr 19 12:09:42.842956 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.842935 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b92e24a5-af38-47f4-b61e-2ec11bd04716-tmp\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " 
pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.843259 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.842829 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/674429c5-1701-4b79-a719-7de71b17fc9c-ovn-node-metrics-cert\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.843810 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.843788 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-cni-binary-copy\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.845084 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-cni-bin\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.845250 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845142 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-host-var-lib-cni-bin\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.845358 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845238 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9920f4d8-e6b0-4993-baa5-e254915bebae-tmp-dir\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf"
Apr 19 12:09:42.845454 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845398 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-host-slash\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:42.845521 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-run-netns\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.845521 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-run-netns\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.845619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845543 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-host-slash\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:42.845619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845598 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-cni-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.845619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9920f4d8-e6b0-4993-baa5-e254915bebae-tmp-dir\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf"
Apr 19 12:09:42.845790 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-tuned\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.845790 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845685 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-modprobe-d\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.845790 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-multus-cni-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.845790 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845716 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-run\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.845790 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-run\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.845790 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845786 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-system-cni-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.846023 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845804 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b92e24a5-af38-47f4-b61e-2ec11bd04716-etc-modprobe-d\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.846023 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845835 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msk5t\" (UniqueName: \"kubernetes.io/projected/4e45a4de-472d-4b7e-addc-01dfec69c9d8-kube-api-access-msk5t\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r"
Apr 19 12:09:42.846023 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-system-cni-dir\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.846023 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.845961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-kubelet\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846023 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846005 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-systemd-units\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846186 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-var-lib-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846186 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846106 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-cnibin\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.846433 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846382 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-host-kubelet\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846433 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-cnibin\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.846433 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846415 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-var-lib-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846433 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846382 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-systemd-units\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846424 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846480 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-log-socket\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846501 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-env-overrides\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846519 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-system-cni-dir\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.846652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-run-openvswitch\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846535 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/674429c5-1701-4b79-a719-7de71b17fc9c-log-socket\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.846652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846567 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/688acc9f-4a93-448c-a106-915356989bff-system-cni-dir\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.846880 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.846865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/674429c5-1701-4b79-a719-7de71b17fc9c-env-overrides\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.849049 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.849024 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgkc8\" (UniqueName: \"kubernetes.io/projected/f35f6e45-1295-40e6-a620-e3a0f9a2dd05-kube-api-access-mgkc8\") pod \"multus-jxzs6\" (UID: \"f35f6e45-1295-40e6-a620-e3a0f9a2dd05\") " pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:42.850477 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.850461 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:42.850533 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.850479 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:42.850533 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.850490 2568 projected.go:194] Error preparing data for projected volume kube-api-access-7z9jb for pod openshift-network-diagnostics/network-check-target-xmmjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:42.850591 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:42.850548 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb podName:571bc17e-6675-462f-9093-2c3531edf595 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:43.350532716 +0000 UTC m=+2.069356884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7z9jb" (UniqueName: "kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb") pod "network-check-target-xmmjm" (UID: "571bc17e-6675-462f-9093-2c3531edf595") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:42.852972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.852946 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9csvp\" (UniqueName: \"kubernetes.io/projected/bd8d4e34-daad-49df-bacb-6940e5066abf-kube-api-access-9csvp\") pod \"aws-ebs-csi-driver-node-smmnb\" (UID: \"bd8d4e34-daad-49df-bacb-6940e5066abf\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:42.853437 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.853416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msk5t\" (UniqueName: \"kubernetes.io/projected/4e45a4de-472d-4b7e-addc-01dfec69c9d8-kube-api-access-msk5t\") pod \"node-ca-t4c4r\" (UID: \"4e45a4de-472d-4b7e-addc-01dfec69c9d8\") " pod="openshift-image-registry/node-ca-t4c4r"
Apr 19 12:09:42.853889 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.853867 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzx4\" (UniqueName: \"kubernetes.io/projected/688acc9f-4a93-448c-a106-915356989bff-kube-api-access-wgzx4\") pod \"multus-additional-cni-plugins-rqgk2\" (UID: \"688acc9f-4a93-448c-a106-915356989bff\") " pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:42.854056 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.854027 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97cb\" (UniqueName: \"kubernetes.io/projected/674429c5-1701-4b79-a719-7de71b17fc9c-kube-api-access-d97cb\") pod \"ovnkube-node-7t4b4\" (UID: \"674429c5-1701-4b79-a719-7de71b17fc9c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:42.854203 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.854181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48nq\" (UniqueName: \"kubernetes.io/projected/9920f4d8-e6b0-4993-baa5-e254915bebae-kube-api-access-s48nq\") pod \"node-resolver-h8wxf\" (UID: \"9920f4d8-e6b0-4993-baa5-e254915bebae\") " pod="openshift-dns/node-resolver-h8wxf"
Apr 19 12:09:42.854417 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.854391 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxnbp\" (UniqueName: \"kubernetes.io/projected/92f813a8-85b5-4872-b0d8-1b7ab33b4e86-kube-api-access-dxnbp\") pod \"iptables-alerter-s2mg7\" (UID: \"92f813a8-85b5-4872-b0d8-1b7ab33b4e86\") " pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:42.854676 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.854659 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz8jl\" (UniqueName: \"kubernetes.io/projected/29686a24-b6da-4655-8af2-679ab3a6bbbf-kube-api-access-lz8jl\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:42.854829 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.854814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd5k8\" (UniqueName: \"kubernetes.io/projected/b92e24a5-af38-47f4-b61e-2ec11bd04716-kube-api-access-qd5k8\") pod \"tuned-445xt\" (UID: \"b92e24a5-af38-47f4-b61e-2ec11bd04716\") " pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:42.894270 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:42.894238 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9537c8e78a3a60f1b8469cd72a5209.slice/crio-a0f64e56f40310ceda3b04f426ac90ebea8680594071dd1022d94035abfb70f3 WatchSource:0}: Error finding container a0f64e56f40310ceda3b04f426ac90ebea8680594071dd1022d94035abfb70f3: Status 404 returned error can't find the container with id a0f64e56f40310ceda3b04f426ac90ebea8680594071dd1022d94035abfb70f3
Apr 19 12:09:42.894654 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:42.894603 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode49faeb3a3d52b4f2890ad643fe20f27.slice/crio-56d7fce75d5e7a3b907576b73508d8bf687aece096e98d7c636fba780bf0d328 WatchSource:0}: Error finding container 56d7fce75d5e7a3b907576b73508d8bf687aece096e98d7c636fba780bf0d328: Status 404 returned error can't find the container with id 56d7fce75d5e7a3b907576b73508d8bf687aece096e98d7c636fba780bf0d328
Apr 19 12:09:42.900708 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:42.900692 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:09:43.050457 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.050356 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jxzs6"
Apr 19 12:09:43.056411 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.056381 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35f6e45_1295_40e6_a620_e3a0f9a2dd05.slice/crio-fa474b76e85c7b553fb732770fd8265a99fb5a49990157b5527418a48a3fa900 WatchSource:0}: Error finding container fa474b76e85c7b553fb732770fd8265a99fb5a49990157b5527418a48a3fa900: Status 404 returned error can't find the container with id fa474b76e85c7b553fb732770fd8265a99fb5a49990157b5527418a48a3fa900
Apr 19 12:09:43.066469 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.066445 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h8wxf"
Apr 19 12:09:43.072297 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.072276 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9920f4d8_e6b0_4993_baa5_e254915bebae.slice/crio-959a53d3f3bfc33ae8a77849475915a5341e05bc7a2a3b672a7470262e969bfb WatchSource:0}: Error finding container 959a53d3f3bfc33ae8a77849475915a5341e05bc7a2a3b672a7470262e969bfb: Status 404 returned error can't find the container with id 959a53d3f3bfc33ae8a77849475915a5341e05bc7a2a3b672a7470262e969bfb
Apr 19 12:09:43.082206 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.082189 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-t4c4r"
Apr 19 12:09:43.086776 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.086755 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rqgk2"
Apr 19 12:09:43.088144 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.088116 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e45a4de_472d_4b7e_addc_01dfec69c9d8.slice/crio-4d1e7f256ceef104defe7845334847a7a949b2e8b9e9c71366d1cfa532f4fbdb WatchSource:0}: Error finding container 4d1e7f256ceef104defe7845334847a7a949b2e8b9e9c71366d1cfa532f4fbdb: Status 404 returned error can't find the container with id 4d1e7f256ceef104defe7845334847a7a949b2e8b9e9c71366d1cfa532f4fbdb
Apr 19 12:09:43.093824 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.093804 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod688acc9f_4a93_448c_a106_915356989bff.slice/crio-ad07ef86f9e6da145ddb2b90ac297c9529addfd02f02b8b6f03b294012319b47 WatchSource:0}: Error finding container ad07ef86f9e6da145ddb2b90ac297c9529addfd02f02b8b6f03b294012319b47: Status 404 returned error can't find the container with id ad07ef86f9e6da145ddb2b90ac297c9529addfd02f02b8b6f03b294012319b47
Apr 19 12:09:43.100689 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.100673 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2mg7"
Apr 19 12:09:43.106518 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.106498 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92f813a8_85b5_4872_b0d8_1b7ab33b4e86.slice/crio-9a61aeefb4b33c0803c41a6911116d68a57985c45b5d589bfb86156f52b2b236 WatchSource:0}: Error finding container 9a61aeefb4b33c0803c41a6911116d68a57985c45b5d589bfb86156f52b2b236: Status 404 returned error can't find the container with id 9a61aeefb4b33c0803c41a6911116d68a57985c45b5d589bfb86156f52b2b236
Apr 19 12:09:43.115089 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.115074 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:09:43.120803 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.120782 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod674429c5_1701_4b79_a719_7de71b17fc9c.slice/crio-5ac863053460adac255428c99eae01b3626fa4cd4ea455a00f5a1afe03783d76 WatchSource:0}: Error finding container 5ac863053460adac255428c99eae01b3626fa4cd4ea455a00f5a1afe03783d76: Status 404 returned error can't find the container with id 5ac863053460adac255428c99eae01b3626fa4cd4ea455a00f5a1afe03783d76
Apr 19 12:09:43.120881 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.120847 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:43.126697 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.126677 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefc3e1e_c62c_487a_a8d5_2c56a47bb505.slice/crio-e46bec8bab0af2cacbddb9a97975ae57d05cfdae59f51299b1e3d8523a17dc13 WatchSource:0}: Error finding container e46bec8bab0af2cacbddb9a97975ae57d05cfdae59f51299b1e3d8523a17dc13: Status 404 returned error can't find the container with id e46bec8bab0af2cacbddb9a97975ae57d05cfdae59f51299b1e3d8523a17dc13
Apr 19 12:09:43.134650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.134613 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb"
Apr 19 12:09:43.140734 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.140714 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8d4e34_daad_49df_bacb_6940e5066abf.slice/crio-6a6ff31319af230b5fbfdb1a43f309b8bc006e9b6506c029e35fab2150e70da2 WatchSource:0}: Error finding container 6a6ff31319af230b5fbfdb1a43f309b8bc006e9b6506c029e35fab2150e70da2: Status 404 returned error can't find the container with id 6a6ff31319af230b5fbfdb1a43f309b8bc006e9b6506c029e35fab2150e70da2
Apr 19 12:09:43.143114 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.143100 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-445xt"
Apr 19 12:09:43.148717 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:09:43.148697 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92e24a5_af38_47f4_b61e_2ec11bd04716.slice/crio-5b26d0efd9afe411e76d785dedd2ab897bf5eaece971e61028b954e171336ed7 WatchSource:0}: Error finding container 5b26d0efd9afe411e76d785dedd2ab897bf5eaece971e61028b954e171336ed7: Status 404 returned error can't find the container with id 5b26d0efd9afe411e76d785dedd2ab897bf5eaece971e61028b954e171336ed7
Apr 19 12:09:43.277264 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.277239 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:43.349962 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.349874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:43.350116 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:43.349994 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:43.350116 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:43.350059 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:09:44.350040388 +0000 UTC m=+3.068864523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:43.451154 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.451115 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:09:43.451313 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:43.451268 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:43.451313 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:43.451285 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:43.451313 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:43.451298 2568 projected.go:194] Error preparing data for projected volume kube-api-access-7z9jb for pod openshift-network-diagnostics/network-check-target-xmmjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:43.451435 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:43.451353 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb podName:571bc17e-6675-462f-9093-2c3531edf595 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:44.451337802 +0000 UTC m=+3.170161934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7z9jb" (UniqueName: "kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb") pod "network-check-target-xmmjm" (UID: "571bc17e-6675-462f-9093-2c3531edf595") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:43.532862 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.532834 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:43.775766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.775681 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:04:42 +0000 UTC" deadline="2027-11-23 00:11:23.279054475 +0000 UTC"
Apr 19 12:09:43.775766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.775718 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13980h1m39.503341276s"
Apr 19 12:09:43.882532 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.882444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h8wxf" event={"ID":"9920f4d8-e6b0-4993-baa5-e254915bebae","Type":"ContainerStarted","Data":"959a53d3f3bfc33ae8a77849475915a5341e05bc7a2a3b672a7470262e969bfb"}
Apr 19 12:09:43.899384 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.899353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jxzs6" event={"ID":"f35f6e45-1295-40e6-a620-e3a0f9a2dd05","Type":"ContainerStarted","Data":"fa474b76e85c7b553fb732770fd8265a99fb5a49990157b5527418a48a3fa900"}
Apr 19 12:09:43.926597 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.926556 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-445xt" event={"ID":"b92e24a5-af38-47f4-b61e-2ec11bd04716","Type":"ContainerStarted","Data":"5b26d0efd9afe411e76d785dedd2ab897bf5eaece971e61028b954e171336ed7"}
Apr 19 12:09:43.949740 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.949709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8gk7l" event={"ID":"fefc3e1e-c62c-487a-a8d5-2c56a47bb505","Type":"ContainerStarted","Data":"e46bec8bab0af2cacbddb9a97975ae57d05cfdae59f51299b1e3d8523a17dc13"}
Apr 19 12:09:43.970192 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.970162 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal" event={"ID":"5b9537c8e78a3a60f1b8469cd72a5209","Type":"ContainerStarted","Data":"a0f64e56f40310ceda3b04f426ac90ebea8680594071dd1022d94035abfb70f3"}
Apr 19 12:09:43.990179 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:43.990106 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal" event={"ID":"e49faeb3a3d52b4f2890ad643fe20f27","Type":"ContainerStarted","Data":"56d7fce75d5e7a3b907576b73508d8bf687aece096e98d7c636fba780bf0d328"}
Apr 19 12:09:44.005944 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.005906 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" event={"ID":"bd8d4e34-daad-49df-bacb-6940e5066abf","Type":"ContainerStarted","Data":"6a6ff31319af230b5fbfdb1a43f309b8bc006e9b6506c029e35fab2150e70da2"}
Apr 19 12:09:44.027996 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.027925 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"5ac863053460adac255428c99eae01b3626fa4cd4ea455a00f5a1afe03783d76"}
Apr 19 12:09:44.041077 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.041042 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2mg7" event={"ID":"92f813a8-85b5-4872-b0d8-1b7ab33b4e86","Type":"ContainerStarted","Data":"9a61aeefb4b33c0803c41a6911116d68a57985c45b5d589bfb86156f52b2b236"}
Apr 19 12:09:44.055683 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.055656 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerStarted","Data":"ad07ef86f9e6da145ddb2b90ac297c9529addfd02f02b8b6f03b294012319b47"}
Apr 19 12:09:44.070800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.070776 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t4c4r" event={"ID":"4e45a4de-472d-4b7e-addc-01dfec69c9d8","Type":"ContainerStarted","Data":"4d1e7f256ceef104defe7845334847a7a949b2e8b9e9c71366d1cfa532f4fbdb"}
Apr 19 12:09:44.359163 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.359076 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:44.359322 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.359225 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:44.359322 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.359290 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:09:46.359268592 +0000 UTC m=+5.078092728 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:44.459842 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.459806 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:09:44.460349 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.460151 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:44.460349 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.460182 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:44.460349 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.460197 2568 projected.go:194] Error preparing data for projected volume kube-api-access-7z9jb for pod openshift-network-diagnostics/network-check-target-xmmjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:44.460349 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.460267 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb podName:571bc17e-6675-462f-9093-2c3531edf595 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:46.460238488 +0000 UTC m=+5.179062634 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7z9jb" (UniqueName: "kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb") pod "network-check-target-xmmjm" (UID: "571bc17e-6675-462f-9093-2c3531edf595") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:44.778692 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.776920 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:04:42 +0000 UTC" deadline="2027-11-02 13:22:10.887848995 +0000 UTC"
Apr 19 12:09:44.778692 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.776957 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13489h12m26.110895081s"
Apr 19 12:09:44.864315 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.864282 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:09:44.864462 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.864401 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:09:44.864876 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.864855 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:44.864984 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:44.864963 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:09:44.924878 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:44.924847 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:09:46.385087 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.385049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:46.385525 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.385205 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:46.385525 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.385274 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:09:50.385254247 +0000 UTC m=+9.104078382 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:46.485673 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.485556 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:46.485841 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.485755 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:09:46.485841 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.485776 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:09:46.485841 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.485788 2568 projected.go:194] Error preparing data for projected volume kube-api-access-7z9jb for pod openshift-network-diagnostics/network-check-target-xmmjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:46.485992 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.485845 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb podName:571bc17e-6675-462f-9093-2c3531edf595 nodeName:}" failed. 
No retries permitted until 2026-04-19 12:09:50.485825632 +0000 UTC m=+9.204649779 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7z9jb" (UniqueName: "kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb") pod "network-check-target-xmmjm" (UID: "571bc17e-6675-462f-9093-2c3531edf595") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:46.692646 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.692570 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jlktl"] Apr 19 12:09:46.694370 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.694337 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.694510 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.694422 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f" Apr 19 12:09:46.789041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.788848 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64f68a15-86a3-4526-a8c3-2d66d94b763f-dbus\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.789041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.788908 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64f68a15-86a3-4526-a8c3-2d66d94b763f-kubelet-config\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.789041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.788962 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.864974 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.864290 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:46.864974 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.864425 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:09:46.864974 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.864841 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:46.864974 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.864926 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:09:46.890260 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.890030 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.890260 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.890103 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64f68a15-86a3-4526-a8c3-2d66d94b763f-dbus\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.890260 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.890142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64f68a15-86a3-4526-a8c3-2d66d94b763f-kubelet-config\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " 
pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.890260 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.890223 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/64f68a15-86a3-4526-a8c3-2d66d94b763f-dbus\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:46.890260 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.890238 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:46.890644 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:46.890307 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret podName:64f68a15-86a3-4526-a8c3-2d66d94b763f nodeName:}" failed. No retries permitted until 2026-04-19 12:09:47.390288065 +0000 UTC m=+6.109112213 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret") pod "global-pull-secret-syncer-jlktl" (UID: "64f68a15-86a3-4526-a8c3-2d66d94b763f") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:46.890644 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:46.890238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/64f68a15-86a3-4526-a8c3-2d66d94b763f-kubelet-config\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:47.395296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:47.395244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:47.395769 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:47.395439 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:47.395769 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:47.395498 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret podName:64f68a15-86a3-4526-a8c3-2d66d94b763f nodeName:}" failed. No retries permitted until 2026-04-19 12:09:48.395480474 +0000 UTC m=+7.114304622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret") pod "global-pull-secret-syncer-jlktl" (UID: "64f68a15-86a3-4526-a8c3-2d66d94b763f") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:48.403210 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:48.403183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:48.403618 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:48.403357 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:48.403618 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:48.403403 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret podName:64f68a15-86a3-4526-a8c3-2d66d94b763f nodeName:}" failed. No retries permitted until 2026-04-19 12:09:50.403389618 +0000 UTC m=+9.122213756 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret") pod "global-pull-secret-syncer-jlktl" (UID: "64f68a15-86a3-4526-a8c3-2d66d94b763f") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:48.864988 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:48.864486 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:48.864988 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:48.864538 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:48.864988 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:48.864613 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:09:48.864988 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:48.864677 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:48.864988 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:48.864785 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:09:48.864988 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:48.864864 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f" Apr 19 12:09:50.417541 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:50.417492 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:50.418162 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:50.417567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:50.418162 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.417690 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:50.418162 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.417735 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:50.418162 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.417776 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret podName:64f68a15-86a3-4526-a8c3-2d66d94b763f nodeName:}" failed. No retries permitted until 2026-04-19 12:09:54.417753739 +0000 UTC m=+13.136577888 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret") pod "global-pull-secret-syncer-jlktl" (UID: "64f68a15-86a3-4526-a8c3-2d66d94b763f") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:50.418162 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.417798 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:09:58.417787572 +0000 UTC m=+17.136611709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:50.518543 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:50.518500 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:50.518724 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.518680 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:09:50.518724 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.518701 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:09:50.518724 ip-10-0-140-225 
kubenswrapper[2568]: E0419 12:09:50.518715 2568 projected.go:194] Error preparing data for projected volume kube-api-access-7z9jb for pod openshift-network-diagnostics/network-check-target-xmmjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:50.518865 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.518779 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb podName:571bc17e-6675-462f-9093-2c3531edf595 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:58.51876035 +0000 UTC m=+17.237584486 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7z9jb" (UniqueName: "kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb") pod "network-check-target-xmmjm" (UID: "571bc17e-6675-462f-9093-2c3531edf595") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:50.864659 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:50.864174 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:50.864659 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:50.864180 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:50.864659 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.864300 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:09:50.864659 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.864397 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:09:50.864659 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:50.864442 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:50.864659 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:50.864508 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f" Apr 19 12:09:52.864388 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:52.864355 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:09:52.864821 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:52.864393 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:09:52.864821 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:52.864448 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:09:52.864821 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:52.864566 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:09:52.865026 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:52.864982 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:09:52.865153 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:52.865112 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:09:54.095863 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.095604 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerStarted","Data":"33e87e9e1cf5b62c86d737435f1923b7e12c57d9a0684cfb49f17b4942a0f6a4"}
Apr 19 12:09:54.097529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.097499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-t4c4r" event={"ID":"4e45a4de-472d-4b7e-addc-01dfec69c9d8","Type":"ContainerStarted","Data":"5abbbd5064f60b989470ff0a7c3a48666f0ef74e3d6e9eeac56c853173b7c3da"}
Apr 19 12:09:54.099099 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.099045 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h8wxf" event={"ID":"9920f4d8-e6b0-4993-baa5-e254915bebae","Type":"ContainerStarted","Data":"8cab8714b3059c40a6f1aed3d00c01feeaf23a923769db3af1f40b8ede60bf0b"}
Apr 19 12:09:54.100694 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.100660 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-445xt" event={"ID":"b92e24a5-af38-47f4-b61e-2ec11bd04716","Type":"ContainerStarted","Data":"c4f92e8ca703eea8d9668d47589991e0f386d3f09f048ecf6c633f98e173ce0b"}
Apr 19 12:09:54.102067 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.102039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8gk7l" event={"ID":"fefc3e1e-c62c-487a-a8d5-2c56a47bb505","Type":"ContainerStarted","Data":"c366e7db633d87c006beaea6cf50eb28c3b42e7162e35a9f7dbc1d8e13aa6469"}
Apr 19 12:09:54.103587 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.103557 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal" event={"ID":"5b9537c8e78a3a60f1b8469cd72a5209","Type":"ContainerStarted","Data":"e7a02895fca1e4579e5e42b9af240de02ee1f1d50c73c7bcaf641f06c74fe6c2"}
Apr 19 12:09:54.105073 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.105044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal" event={"ID":"e49faeb3a3d52b4f2890ad643fe20f27","Type":"ContainerStarted","Data":"b0281ef124fadb4df1ac347c3def3bef0edbeaaf04dc4018ca2905115c825f9d"}
Apr 19 12:09:54.105227 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.105206 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:54.106335 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.106314 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" event={"ID":"bd8d4e34-daad-49df-bacb-6940e5066abf","Type":"ContainerStarted","Data":"d69f8de9ace60bde85e51b9be631e276857a1b4b3c4cf386f6f8622afdecc3ba"}
Apr 19 12:09:54.112882 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.112867 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:09:54.113439 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.113423 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"]
Apr 19 12:09:54.129735 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.129694 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal" podStartSLOduration=0.129682934 podStartE2EDuration="129.682934ms" podCreationTimestamp="2026-04-19 12:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:09:54.129282905 +0000 UTC m=+12.848107053" watchObservedRunningTime="2026-04-19 12:09:54.129682934 +0000 UTC m=+12.848507088"
Apr 19 12:09:54.153813 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.153762 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8gk7l" podStartSLOduration=2.311282691 podStartE2EDuration="12.15374844s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.128120233 +0000 UTC m=+1.846944366" lastFinishedPulling="2026-04-19 12:09:52.970585975 +0000 UTC m=+11.689410115" observedRunningTime="2026-04-19 12:09:54.153693366 +0000 UTC m=+12.872517531" watchObservedRunningTime="2026-04-19 12:09:54.15374844 +0000 UTC m=+12.872572596"
Apr 19 12:09:54.166217 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.166176 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h8wxf" podStartSLOduration=2.27312961 podStartE2EDuration="12.166161616s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.074335872 +0000 UTC m=+1.793160008" lastFinishedPulling="2026-04-19 12:09:52.967367866 +0000 UTC m=+11.686192014" observedRunningTime="2026-04-19 12:09:54.165957207 +0000 UTC m=+12.884781426" watchObservedRunningTime="2026-04-19 12:09:54.166161616 +0000 UTC m=+12.884985770"
Apr 19 12:09:54.178674 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.178612 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-t4c4r" podStartSLOduration=2.298719685 podStartE2EDuration="12.178598982s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.090262801 +0000 UTC m=+1.809086934" lastFinishedPulling="2026-04-19 12:09:52.970142089 +0000 UTC m=+11.688966231" observedRunningTime="2026-04-19 12:09:54.178235239 +0000 UTC m=+12.897059395" watchObservedRunningTime="2026-04-19 12:09:54.178598982 +0000 UTC m=+12.897423138"
Apr 19 12:09:54.192879 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.192844 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-445xt" podStartSLOduration=2.333614229 podStartE2EDuration="12.192832285s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.15008737 +0000 UTC m=+1.868911502" lastFinishedPulling="2026-04-19 12:09:53.009305421 +0000 UTC m=+11.728129558" observedRunningTime="2026-04-19 12:09:54.192434105 +0000 UTC m=+12.911258259" watchObservedRunningTime="2026-04-19 12:09:54.192832285 +0000 UTC m=+12.911656442"
Apr 19 12:09:54.451010 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.450980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:09:54.451163 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:54.451123 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:09:54.451220 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:54.451182 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret podName:64f68a15-86a3-4526-a8c3-2d66d94b763f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:02.451167954 +0000 UTC m=+21.169992087 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret") pod "global-pull-secret-syncer-jlktl" (UID: "64f68a15-86a3-4526-a8c3-2d66d94b763f") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:09:54.864596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.864513 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:09:54.864596 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.864589 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:54.864817 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:54.864714 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:09:54.864817 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:54.864724 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:09:54.864817 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:54.864800 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:09:54.864960 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:54.864857 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:09:55.109445 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:55.109388 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2mg7" event={"ID":"92f813a8-85b5-4872-b0d8-1b7ab33b4e86","Type":"ContainerStarted","Data":"bb882c969a6c70df0532ff5e2953616f7d05288e487236783882760504e131ea"}
Apr 19 12:09:55.110052 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:55.109887 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:55.118113 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:55.118060 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:09:55.118113 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:55.118105 2568 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-proxy-ip-10-0-140-225.ec2.internal\" already exists" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-225.ec2.internal"
Apr 19 12:09:55.122495 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:55.122450 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s2mg7" podStartSLOduration=3.262845382 podStartE2EDuration="13.122432496s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.107898953 +0000 UTC m=+1.826723087" lastFinishedPulling="2026-04-19 12:09:52.967486057 +0000 UTC m=+11.686310201" observedRunningTime="2026-04-19 12:09:55.122085268 +0000 UTC m=+13.840909424" watchObservedRunningTime="2026-04-19 12:09:55.122432496 +0000 UTC m=+13.841256649"
Apr 19 12:09:56.864311 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:56.864278 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:56.864832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:56.864275 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:09:56.864832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:56.864276 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:09:56.864832 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:56.864476 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:09:56.864832 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:56.864425 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:09:56.864832 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:56.864521 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:09:57.494266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:57.494229 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:57.495175 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:57.495147 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:57.736436 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:57.736409 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:57.737029 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:57.737010 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8gk7l"
Apr 19 12:09:58.117112 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.117079 2568 generic.go:358] "Generic (PLEG): container finished" podID="688acc9f-4a93-448c-a106-915356989bff" containerID="33e87e9e1cf5b62c86d737435f1923b7e12c57d9a0684cfb49f17b4942a0f6a4" exitCode=0
Apr 19 12:09:58.117506 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.117167 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerDied","Data":"33e87e9e1cf5b62c86d737435f1923b7e12c57d9a0684cfb49f17b4942a0f6a4"}
Apr 19 12:09:58.118752 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.118690 2568 generic.go:358] "Generic (PLEG): container finished" podID="5b9537c8e78a3a60f1b8469cd72a5209" containerID="e7a02895fca1e4579e5e42b9af240de02ee1f1d50c73c7bcaf641f06c74fe6c2" exitCode=0
Apr 19 12:09:58.118813 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.118755 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal" event={"ID":"5b9537c8e78a3a60f1b8469cd72a5209","Type":"ContainerDied","Data":"e7a02895fca1e4579e5e42b9af240de02ee1f1d50c73c7bcaf641f06c74fe6c2"}
Apr 19 12:09:58.482951 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.482922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:58.483185 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.483082 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:58.483185 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.483150 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:10:14.483129123 +0000 UTC m=+33.201953267 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:58.583834 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.583796 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:09:58.583990 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.583971 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:58.584042 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.583999 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:58.584042 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.584014 2568 projected.go:194] Error preparing data for projected volume kube-api-access-7z9jb for pod openshift-network-diagnostics/network-check-target-xmmjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:58.584142 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.584076 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb podName:571bc17e-6675-462f-9093-2c3531edf595 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:14.584055999 +0000 UTC m=+33.302880142 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7z9jb" (UniqueName: "kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb") pod "network-check-target-xmmjm" (UID: "571bc17e-6675-462f-9093-2c3531edf595") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:58.864237 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.864159 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:09:58.864237 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.864193 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:09:58.864237 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:09:58.864159 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:09:58.864521 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.864299 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:09:58.864521 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.864380 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:09:58.864610 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:09:58.864514 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:10:00.863867 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:00.863825 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:00.864328 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:00.863825 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:10:00.864328 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:00.863826 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:10:00.864328 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:00.863965 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:10:00.864328 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:00.864034 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:10:00.864328 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:00.864132 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:10:02.292667 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.292476 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 19 12:10:02.513841 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.513814 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:02.513971 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:02.513958 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:10:02.514034 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:02.514009 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret podName:64f68a15-86a3-4526-a8c3-2d66d94b763f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:18.513993212 +0000 UTC m=+37.232817345 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret") pod "global-pull-secret-syncer-jlktl" (UID: "64f68a15-86a3-4526-a8c3-2d66d94b763f") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:10:02.800556 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.800440 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T12:10:02.292664512Z","UUID":"a1356c44-6b59-4ce7-add2-ab86fad84eea","Handler":null,"Name":"","Endpoint":""}
Apr 19 12:10:02.802274 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.802252 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 19 12:10:02.802274 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.802280 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 19 12:10:02.864856 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.864827 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:02.864856 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.864845 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:10:02.865030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:02.864828 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:10:02.865030 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:02.864939 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:10:02.865138 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:02.865031 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:10:02.865202 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:02.865161 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:10:03.132324 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.132250 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" event={"ID":"bd8d4e34-daad-49df-bacb-6940e5066abf","Type":"ContainerStarted","Data":"0a31061893534f3a268cfd8cbfc89fbf7cca62a05580cc7cc1254a379b6d5f77"}
Apr 19 12:10:03.135277 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.135248 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"92d73733eaffaf75aa71ab1045c8ce5d9d9319c962905be49805422aee20a1c1"}
Apr 19 12:10:03.135401 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.135284 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"5203df245ce90ff88667608f9b5274b3c761af0f5db6d4e80803eafee2c11ef7"}
Apr 19 12:10:03.135401 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.135298 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"93fc91da196d30333b710d6bcffe8a75fb25f4282483da1540a6f06403c9830f"}
Apr 19 12:10:03.135401 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.135307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"9ae48879c69c57d10952d297fda295c45c6d111f75892022dc18439f1f036856"}
Apr 19 12:10:03.135401 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.135318 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"8fcef65256e6899af4f3a4918c4e5e7f5396a817097223a21d633c30d1022d35"}
Apr 19 12:10:03.135401 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.135330 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"3c99091591bd4576480cd8037b2339694709eb4b98513c919142fce600e86472"}
Apr 19 12:10:03.136630 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.136603 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jxzs6" event={"ID":"f35f6e45-1295-40e6-a620-e3a0f9a2dd05","Type":"ContainerStarted","Data":"13e6b356aa74fa0d86c7223fd922c746e509efb6daddd822097cb97f048f57a4"}
Apr 19 12:10:03.138678 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.138657 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal" event={"ID":"5b9537c8e78a3a60f1b8469cd72a5209","Type":"ContainerStarted","Data":"c5d0972bcb28cd3ae13ca96778f19c49498f8dd86b01fe4849b6932210734db7"}
Apr 19 12:10:03.152868 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.152824 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jxzs6" podStartSLOduration=3.163343017 podStartE2EDuration="22.152807943s" podCreationTimestamp="2026-04-19 12:09:41 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.057985137 +0000 UTC m=+1.776809269" lastFinishedPulling="2026-04-19 12:10:02.047450062 +0000 UTC m=+20.766274195" observedRunningTime="2026-04-19 12:10:03.15228893 +0000 UTC m=+21.871113096" watchObservedRunningTime="2026-04-19 12:10:03.152807943 +0000 UTC m=+21.871632100"
Apr 19 12:10:03.165551 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:03.165496 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-225.ec2.internal" podStartSLOduration=21.165480515 podStartE2EDuration="21.165480515s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:10:03.164693062 +0000 UTC m=+21.883517218" watchObservedRunningTime="2026-04-19 12:10:03.165480515 +0000 UTC m=+21.884304685"
Apr 19 12:10:04.142644 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:04.142590 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" event={"ID":"bd8d4e34-daad-49df-bacb-6940e5066abf","Type":"ContainerStarted","Data":"082ba50bf0daa31461673b9e988ce5734fcdca489f15495973f8c128d8ea22c9"}
Apr 19 12:10:04.159016 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:04.158976 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-smmnb" podStartSLOduration=2.024735906 podStartE2EDuration="22.158963023s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.141786524 +0000 UTC m=+1.860610660" lastFinishedPulling="2026-04-19 12:10:03.276013642 +0000 UTC m=+21.994837777" observedRunningTime="2026-04-19 12:10:04.15876002 +0000 UTC m=+22.877584178" watchObservedRunningTime="2026-04-19 12:10:04.158963023 +0000 UTC m=+22.877787177"
Apr 19 12:10:04.864007 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:04.863920 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:04.864172 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:04.863923 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:10:04.864172 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:04.864031 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:10:04.864172 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:04.863923 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:10:04.864172 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:04.864092 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:10:04.864172 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:04.864150 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:10:05.147999 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:05.147918 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"d9d074706ba111abe52f819b346f6d63d5b5b2a306dd33bd00433e953923648b"}
Apr 19 12:10:06.864136 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:06.864104 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:10:06.864541 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:06.864104 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:06.864541 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:06.864235 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595"
Apr 19 12:10:06.864541 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:06.864302 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:10:06.864541 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:06.864104 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:10:06.864541 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:06.864404 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf"
Apr 19 12:10:07.155188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:07.155148 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" event={"ID":"674429c5-1701-4b79-a719-7de71b17fc9c","Type":"ContainerStarted","Data":"0fa18acb78785891185273c3b5fb605487998be840daff1dc3ad4ca08fb0a213"}
Apr 19 12:10:07.155536 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:07.155513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:10:07.171775 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:07.171755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:10:07.221182 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:07.219727 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" podStartSLOduration=6.243505082 podStartE2EDuration="25.219709637s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.123578801 +0000 UTC m=+1.842402937" lastFinishedPulling="2026-04-19 12:10:02.099783341 +0000 UTC m=+20.818607492" observedRunningTime="2026-04-19 12:10:07.194177294 +0000 UTC m=+25.913001450" watchObservedRunningTime="2026-04-19 12:10:07.219709637 +0000 UTC m=+25.938533794"
Apr 19 12:10:08.157006 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.156977 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 19 12:10:08.157582 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.157547 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:10:08.173083 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.173054 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4"
Apr 19 12:10:08.634749 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.634721 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jlktl"]
Apr 19 12:10:08.634903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.634844 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:08.634995 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:08.634974 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f"
Apr 19 12:10:08.638154 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.638125 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7vkmz"]
Apr 19 12:10:08.638277 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.638239 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:10:08.638362 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:08.638341 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:10:08.638822 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.638802 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xmmjm"] Apr 19 12:10:08.638917 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:08.638904 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:08.639014 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:08.638995 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:10:09.160362 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:09.160284 2568 generic.go:358] "Generic (PLEG): container finished" podID="688acc9f-4a93-448c-a106-915356989bff" containerID="536e13bcd5075aeb0e90a81d119ac156d5010ea849abc0504a0fb415890e55f8" exitCode=0 Apr 19 12:10:09.160362 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:09.160352 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerDied","Data":"536e13bcd5075aeb0e90a81d119ac156d5010ea849abc0504a0fb415890e55f8"} Apr 19 12:10:09.160896 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:09.160506 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 19 12:10:09.573021 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:09.572987 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:10:09.864967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:09.864887 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:09.865129 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:09.865003 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:10:10.864712 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:10.864512 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:10:10.865030 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:10.864777 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f" Apr 19 12:10:10.865030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:10.864527 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:10:10.865030 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:10.864920 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:10:11.166031 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:11.165955 2568 generic.go:358] "Generic (PLEG): container finished" podID="688acc9f-4a93-448c-a106-915356989bff" containerID="062a1e8e49683a225723666a9e7f178b9c5a761a488fef3d61a20ff4d87b674f" exitCode=0 Apr 19 12:10:11.166161 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:11.166030 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerDied","Data":"062a1e8e49683a225723666a9e7f178b9c5a761a488fef3d61a20ff4d87b674f"} Apr 19 12:10:11.176751 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:11.176706 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" podUID="674429c5-1701-4b79-a719-7de71b17fc9c" containerName="ovnkube-controller" probeResult="failure" output="" Apr 19 12:10:11.867676 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:11.867644 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:11.868308 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:11.867778 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:10:12.170142 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:12.170116 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerStarted","Data":"afa996035c390e843d3ac43877d07ca9d1101a29ae0b5aec9d46cff4bf09a471"} Apr 19 12:10:12.864093 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:12.864061 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:10:12.864093 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:12.864092 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl" Apr 19 12:10:12.864280 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:12.864166 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vkmz" podUID="29686a24-b6da-4655-8af2-679ab3a6bbbf" Apr 19 12:10:12.864318 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:12.864271 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-jlktl" podUID="64f68a15-86a3-4526-a8c3-2d66d94b763f" Apr 19 12:10:13.174374 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:13.174347 2568 generic.go:358] "Generic (PLEG): container finished" podID="688acc9f-4a93-448c-a106-915356989bff" containerID="afa996035c390e843d3ac43877d07ca9d1101a29ae0b5aec9d46cff4bf09a471" exitCode=0 Apr 19 12:10:13.174831 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:13.174417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerDied","Data":"afa996035c390e843d3ac43877d07ca9d1101a29ae0b5aec9d46cff4bf09a471"} Apr 19 12:10:13.864256 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:13.864224 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:13.864602 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:13.864346 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xmmjm" podUID="571bc17e-6675-462f-9093-2c3531edf595" Apr 19 12:10:14.514456 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.514241 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:10:14.514772 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.514377 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:10:14.514772 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.514546 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:10:46.514529029 +0000 UTC m=+65.233353175 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:10:14.584919 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.584890 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-225.ec2.internal" event="NodeReady" Apr 19 12:10:14.585040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.585025 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 19 12:10:14.615484 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.615450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:14.615720 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.615686 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:10:14.615720 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.615711 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:10:14.615720 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.615723 2568 projected.go:194] Error preparing data for projected volume kube-api-access-7z9jb for pod openshift-network-diagnostics/network-check-target-xmmjm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 19 12:10:14.615915 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.615769 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb podName:571bc17e-6675-462f-9093-2c3531edf595 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:46.615755137 +0000 UTC m=+65.334579271 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7z9jb" (UniqueName: "kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb") pod "network-check-target-xmmjm" (UID: "571bc17e-6675-462f-9093-2c3531edf595") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:10:14.616334 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.616317 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f5598bc7f-hrzcz"] Apr 19 12:10:14.619218 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.619203 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.622151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.621965 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rh992\"" Apr 19 12:10:14.622151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.621977 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 19 12:10:14.622151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.621998 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 19 12:10:14.622350 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.622148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 19 12:10:14.627562 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.627529 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9xxkb"] Apr 19 12:10:14.629211 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.629192 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 19 12:10:14.631093 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.631076 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:10:14.631647 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.631502 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f5598bc7f-hrzcz"] Apr 19 12:10:14.633527 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.633497 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zfmpv\"" Apr 19 12:10:14.633608 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.633538 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 19 12:10:14.633608 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.633567 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 19 12:10:14.634674 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.634654 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 19 12:10:14.634885 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.634867 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9xxkb"] Apr 19 12:10:14.716183 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716092 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:10:14.716183 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716123 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-certificates\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.716183 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716143 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-bound-sa-token\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.716376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716197 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brm6c\" (UniqueName: \"kubernetes.io/projected/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-kube-api-access-brm6c\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:10:14.716376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716229 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-image-registry-private-configuration\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.716376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716293 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " 
pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.716376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716310 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7319-d55e-4c55-b262-2e01232a5a5c-ca-trust-extracted\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.716376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-trusted-ca\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.716376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjctz\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-kube-api-access-zjctz\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.716566 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.716385 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-installation-pull-secrets\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.729266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.729241 2568 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kzhlq"] Apr 19 12:10:14.749954 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.749926 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kzhlq"] Apr 19 12:10:14.750059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.749979 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kzhlq" Apr 19 12:10:14.752347 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.752330 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7hvqw\"" Apr 19 12:10:14.752509 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.752490 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 19 12:10:14.752509 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.752503 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 19 12:10:14.817658 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.817775 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817670 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7319-d55e-4c55-b262-2e01232a5a5c-ca-trust-extracted\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.817775 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817692 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-trusted-ca\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.817775 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjctz\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-kube-api-access-zjctz\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:14.817775 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.817749 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:10:14.817775 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.817770 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found Apr 19 12:10:14.817775 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817758 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v295p\" (UniqueName: \"kubernetes.io/projected/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-kube-api-access-v295p\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq" Apr 19 12:10:14.818060 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.817831 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. 
No retries permitted until 2026-04-19 12:10:15.317812328 +0000 UTC m=+34.036636473 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found
Apr 19 12:10:14.818060 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-installation-pull-secrets\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.818060 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817958 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:14.818060 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.817987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-certificates\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.818060 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818017 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-tmp-dir\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.818302 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818094 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-bound-sa-token\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.818302 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818111 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7319-d55e-4c55-b262-2e01232a5a5c-ca-trust-extracted\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.818302 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818136 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brm6c\" (UniqueName: \"kubernetes.io/projected/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-kube-api-access-brm6c\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:14.818302 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.818170 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:14.818302 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-image-registry-private-configuration\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.818302 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818202 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-config-volume\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.818302 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.818218 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:15.318200785 +0000 UTC m=+34.037024918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found
Apr 19 12:10:14.818640 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.818739 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-certificates\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.818960 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.818932 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-trusted-ca\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.823726 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.823700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-image-registry-private-configuration\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.823856 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.823726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-installation-pull-secrets\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.826355 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.826311 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-bound-sa-token\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.826835 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.826812 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjctz\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-kube-api-access-zjctz\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:14.826972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.826952 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brm6c\" (UniqueName: \"kubernetes.io/projected/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-kube-api-access-brm6c\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:14.864670 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.864649 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:14.864771 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.864673 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz"
Apr 19 12:10:14.867066 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.867046 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 19 12:10:14.867173 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.867082 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 19 12:10:14.867233 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.867216 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lsl4x\""
Apr 19 12:10:14.919557 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.919538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-tmp-dir\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.919701 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.919582 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-config-volume\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.919701 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.919606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.919807 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.919730 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v295p\" (UniqueName: \"kubernetes.io/projected/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-kube-api-access-v295p\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.919860 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.919821 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:14.919912 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:14.919883 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:15.419862644 +0000 UTC m=+34.138686777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:14.919979 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.919910 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-tmp-dir\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.920266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.920249 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-config-volume\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:14.930352 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:14.930329 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v295p\" (UniqueName: \"kubernetes.io/projected/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-kube-api-access-v295p\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:15.323150 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:15.323108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:15.323321 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:15.323250 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:15.323321 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:15.323307 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:16.323293079 +0000 UTC m=+35.042117214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found
Apr 19 12:10:15.323321 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:15.323318 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:10:15.323472 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:15.323256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:15.323472 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:15.323328 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found
Apr 19 12:10:15.323472 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:15.323450 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. No retries permitted until 2026-04-19 12:10:16.3234359 +0000 UTC m=+35.042260037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found
Apr 19 12:10:15.424178 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:15.424127 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:15.424379 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:15.424305 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:15.424439 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:15.424380 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:16.424361183 +0000 UTC m=+35.143185333 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:15.870038 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:15.870009 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm"
Apr 19 12:10:15.872651 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:15.872616 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 19 12:10:15.873478 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:15.873461 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-d9pnm\""
Apr 19 12:10:15.873540 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:15.873462 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 19 12:10:16.330576 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:16.330542 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:16.330774 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:16.330619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:16.330774 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:16.330743 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:10:16.330774 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:16.330756 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:16.330774 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:16.330765 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found
Apr 19 12:10:16.330961 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:16.330825 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:18.330806644 +0000 UTC m=+37.049630793 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found
Apr 19 12:10:16.330961 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:16.330845 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. No retries permitted until 2026-04-19 12:10:18.330836414 +0000 UTC m=+37.049660549 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found
Apr 19 12:10:16.430982 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:16.430951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:16.431158 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:16.431135 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:16.431227 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:16.431215 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:18.431193562 +0000 UTC m=+37.150017708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:18.343635 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:18.343587 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:18.344051 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:18.343680 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:18.344051 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:18.343766 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:10:18.344051 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:18.343764 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:18.344051 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:18.343843 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:22.343823634 +0000 UTC m=+41.062647767 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found
Apr 19 12:10:18.344051 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:18.343777 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found
Apr 19 12:10:18.344051 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:18.343897 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. No retries permitted until 2026-04-19 12:10:22.343886426 +0000 UTC m=+41.062710559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found
Apr 19 12:10:18.444545 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:18.444504 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:18.444748 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:18.444671 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:18.444748 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:18.444745 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:22.444729694 +0000 UTC m=+41.163553828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:18.545132 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:18.545049 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:18.547941 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:18.547913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/64f68a15-86a3-4526-a8c3-2d66d94b763f-original-pull-secret\") pod \"global-pull-secret-syncer-jlktl\" (UID: \"64f68a15-86a3-4526-a8c3-2d66d94b763f\") " pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:18.774868 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:18.774827 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jlktl"
Apr 19 12:10:20.699864 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:20.699834 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jlktl"]
Apr 19 12:10:20.703299 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:10:20.703273 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f68a15_86a3_4526_a8c3_2d66d94b763f.slice/crio-5dd3a835fe41735b83405e942720c5eaf7aac60060d84c8fa377c6f1312872f9 WatchSource:0}: Error finding container 5dd3a835fe41735b83405e942720c5eaf7aac60060d84c8fa377c6f1312872f9: Status 404 returned error can't find the container with id 5dd3a835fe41735b83405e942720c5eaf7aac60060d84c8fa377c6f1312872f9
Apr 19 12:10:21.191583 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:21.191370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerStarted","Data":"da954becc64a1772617db242efdb7237ac128b5d82d67627111803d7766feca3"}
Apr 19 12:10:21.192680 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:21.192654 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jlktl" event={"ID":"64f68a15-86a3-4526-a8c3-2d66d94b763f","Type":"ContainerStarted","Data":"5dd3a835fe41735b83405e942720c5eaf7aac60060d84c8fa377c6f1312872f9"}
Apr 19 12:10:22.196521 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:22.196488 2568 generic.go:358] "Generic (PLEG): container finished" podID="688acc9f-4a93-448c-a106-915356989bff" containerID="da954becc64a1772617db242efdb7237ac128b5d82d67627111803d7766feca3" exitCode=0
Apr 19 12:10:22.197006 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:22.196548 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerDied","Data":"da954becc64a1772617db242efdb7237ac128b5d82d67627111803d7766feca3"}
Apr 19 12:10:22.373676 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:22.373639 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:22.373822 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:22.373736 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:22.373822 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:22.373797 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:10:22.373822 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:22.373812 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found
Apr 19 12:10:22.373822 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:22.373798 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:30.373780244 +0000 UTC m=+49.092604395 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found
Apr 19 12:10:22.374022 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:22.373737 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:22.374022 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:22.373844 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. No retries permitted until 2026-04-19 12:10:30.373834438 +0000 UTC m=+49.092658586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found
Apr 19 12:10:22.475186 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:22.475100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:22.475309 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:22.475269 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:22.475362 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:22.475334 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:30.475317729 +0000 UTC m=+49.194141863 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:23.201431 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:23.201400 2568 generic.go:358] "Generic (PLEG): container finished" podID="688acc9f-4a93-448c-a106-915356989bff" containerID="0739db4ce7631debaf5845157bc81a8b0f71c98130639486a0da8b81070bca7b" exitCode=0
Apr 19 12:10:23.202010 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:23.201485 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerDied","Data":"0739db4ce7631debaf5845157bc81a8b0f71c98130639486a0da8b81070bca7b"}
Apr 19 12:10:24.208367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:24.208331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" event={"ID":"688acc9f-4a93-448c-a106-915356989bff","Type":"ContainerStarted","Data":"0b1dd1b1fd5d363309d4c0d421da7d6ff89e67dcfefaa43824a63137d649be3a"}
Apr 19 12:10:24.230656 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:24.230586 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rqgk2" podStartSLOduration=4.413610747 podStartE2EDuration="42.230570102s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:09:43.09531105 +0000 UTC m=+1.814135182" lastFinishedPulling="2026-04-19 12:10:20.9122704 +0000 UTC m=+39.631094537" observedRunningTime="2026-04-19 12:10:24.229327899 +0000 UTC m=+42.948152066" watchObservedRunningTime="2026-04-19 12:10:24.230570102 +0000 UTC m=+42.949394257"
Apr 19 12:10:27.216492 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:27.216285 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jlktl" event={"ID":"64f68a15-86a3-4526-a8c3-2d66d94b763f","Type":"ContainerStarted","Data":"ccce66096ac1e7740a56bc1766e79e65ce5e6774c0120d0c8d830683e94b0ca9"}
Apr 19 12:10:27.233538 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:27.233497 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jlktl" podStartSLOduration=35.737276912 podStartE2EDuration="41.233483669s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:10:20.705194843 +0000 UTC m=+39.424018976" lastFinishedPulling="2026-04-19 12:10:26.201401598 +0000 UTC m=+44.920225733" observedRunningTime="2026-04-19 12:10:27.232514844 +0000 UTC m=+45.951338996" watchObservedRunningTime="2026-04-19 12:10:27.233483669 +0000 UTC m=+45.952307823"
Apr 19 12:10:30.438055 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:30.438023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb"
Apr 19 12:10:30.438447 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:30.438081 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz"
Apr 19 12:10:30.438447 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:30.438163 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 19 12:10:30.438447 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:30.438174 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found
Apr 19 12:10:30.438447 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:30.438176 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:30.438447 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:30.438230 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. No retries permitted until 2026-04-19 12:10:46.438215131 +0000 UTC m=+65.157039264 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found
Apr 19 12:10:30.438447 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:30.438242 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:46.438236291 +0000 UTC m=+65.157060423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found
Apr 19 12:10:30.538743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:30.538712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq"
Apr 19 12:10:30.538904 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:30.538861 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:30.541668 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:30.539261 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:46.53890764 +0000 UTC m=+65.257731774 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found Apr 19 12:10:41.176113 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:41.176082 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7t4b4" Apr 19 12:10:46.457452 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.457406 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:10:46.457921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.457483 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:10:46.457921 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.457562 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:10:46.457921 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.457571 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found Apr 19 12:10:46.457921 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.457583 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:10:46.457921 ip-10-0-140-225 
kubenswrapper[2568]: E0419 12:10:46.457614 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. No retries permitted until 2026-04-19 12:11:18.457601793 +0000 UTC m=+97.176425926 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found Apr 19 12:10:46.457921 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.457674 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:18.457655789 +0000 UTC m=+97.176479925 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found Apr 19 12:10:46.558232 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.558202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq" Apr 19 12:10:46.558439 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.558268 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:10:46.558439 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.558351 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:10:46.558439 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.558422 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:18.558407152 +0000 UTC m=+97.277231288 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found Apr 19 12:10:46.560650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.560615 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 19 12:10:46.569210 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.569191 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 12:10:46.569306 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:10:46.569245 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs podName:29686a24-b6da-4655-8af2-679ab3a6bbbf nodeName:}" failed. No retries permitted until 2026-04-19 12:11:50.569229233 +0000 UTC m=+129.288053367 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs") pod "network-metrics-daemon-7vkmz" (UID: "29686a24-b6da-4655-8af2-679ab3a6bbbf") : secret "metrics-daemon-secret" not found Apr 19 12:10:46.659213 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.659183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:46.661500 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.661484 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 12:10:46.671978 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.671962 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 12:10:46.684471 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.684445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z9jb\" (UniqueName: \"kubernetes.io/projected/571bc17e-6675-462f-9093-2c3531edf595-kube-api-access-7z9jb\") pod \"network-check-target-xmmjm\" (UID: \"571bc17e-6675-462f-9093-2c3531edf595\") " pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:46.781651 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.781554 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-d9pnm\"" Apr 19 12:10:46.789435 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.789413 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:46.898925 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:46.898890 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xmmjm"] Apr 19 12:10:46.901709 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:10:46.901682 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod571bc17e_6675_462f_9093_2c3531edf595.slice/crio-4a6effe7648495ecebdcdbc8d0a85a43ea142a4b82715b0010abd301d80fa8d3 WatchSource:0}: Error finding container 4a6effe7648495ecebdcdbc8d0a85a43ea142a4b82715b0010abd301d80fa8d3: Status 404 returned error can't find the container with id 4a6effe7648495ecebdcdbc8d0a85a43ea142a4b82715b0010abd301d80fa8d3 Apr 19 12:10:47.253729 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:47.253692 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xmmjm" event={"ID":"571bc17e-6675-462f-9093-2c3531edf595","Type":"ContainerStarted","Data":"4a6effe7648495ecebdcdbc8d0a85a43ea142a4b82715b0010abd301d80fa8d3"} Apr 19 12:10:51.263091 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:51.262992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xmmjm" event={"ID":"571bc17e-6675-462f-9093-2c3531edf595","Type":"ContainerStarted","Data":"ebcc33610df26fc844259a5130038c581dcd92fd5695d7cb1dbda6002c1e6ced"} Apr 19 12:10:51.263523 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:51.263114 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:10:51.276068 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:10:51.276029 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xmmjm" 
podStartSLOduration=65.175952155 podStartE2EDuration="1m9.276016329s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="2026-04-19 12:10:46.903594252 +0000 UTC m=+65.622418390" lastFinishedPulling="2026-04-19 12:10:51.003658429 +0000 UTC m=+69.722482564" observedRunningTime="2026-04-19 12:10:51.275511566 +0000 UTC m=+69.994335721" watchObservedRunningTime="2026-04-19 12:10:51.276016329 +0000 UTC m=+69.994840462" Apr 19 12:11:18.483612 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:18.483508 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:11:18.484072 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:18.483683 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 19 12:11:18.484072 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:18.483710 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5f5598bc7f-hrzcz: secret "image-registry-tls" not found Apr 19 12:11:18.484072 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:18.483748 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:11:18.484072 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:18.483681 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:11:18.484072 ip-10-0-140-225 kubenswrapper[2568]: E0419 
12:11:18.483782 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls podName:0e0d7319-d55e-4c55-b262-2e01232a5a5c nodeName:}" failed. No retries permitted until 2026-04-19 12:12:22.48376196 +0000 UTC m=+161.202586114 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls") pod "image-registry-5f5598bc7f-hrzcz" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c") : secret "image-registry-tls" not found Apr 19 12:11:18.484072 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:18.483825 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert podName:15abc5b3-a4e0-41a2-b57d-ee187b37cd52 nodeName:}" failed. No retries permitted until 2026-04-19 12:12:22.483805188 +0000 UTC m=+161.202629325 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert") pod "ingress-canary-9xxkb" (UID: "15abc5b3-a4e0-41a2-b57d-ee187b37cd52") : secret "canary-serving-cert" not found Apr 19 12:11:18.584825 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:18.584788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq" Apr 19 12:11:18.584996 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:18.584895 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:11:18.584996 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:18.584942 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls podName:9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2 nodeName:}" failed. No retries permitted until 2026-04-19 12:12:22.584929233 +0000 UTC m=+161.303753366 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls") pod "dns-default-kzhlq" (UID: "9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2") : secret "dns-default-metrics-tls" not found Apr 19 12:11:20.375572 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.375542 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-57d6b784d-fjqwn"] Apr 19 12:11:20.380028 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.380013 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.382354 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.382059 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-85dv2\"" Apr 19 12:11:20.382354 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.382270 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 19 12:11:20.384072 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.382680 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 19 12:11:20.384072 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.382806 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 19 12:11:20.384072 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.382884 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 19 12:11:20.384072 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.382908 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 19 12:11:20.385508 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.385486 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57d6b784d-fjqwn"] Apr 19 12:11:20.385981 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.385964 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 19 12:11:20.475786 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.475759 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr"] Apr 19 12:11:20.478572 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.478556 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" Apr 19 12:11:20.480779 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.480749 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 19 12:11:20.480779 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.480770 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:11:20.480942 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.480794 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fmkf9\"" Apr 19 12:11:20.485852 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.485831 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr"] Apr 19 
12:11:20.497352 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.497328 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-stats-auth\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.497533 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.497516 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.497731 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.497717 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-default-certificate\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.497872 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.497856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s42xm\" (UniqueName: \"kubernetes.io/projected/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-kube-api-access-s42xm\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.498037 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.498023 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.598763 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.598736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.598763 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.598767 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqbbx\" (UniqueName: \"kubernetes.io/projected/14272c98-366d-4c1a-a78d-04018f274961-kube-api-access-fqbbx\") pod \"volume-data-source-validator-7c6cbb6c87-pqspr\" (UID: \"14272c98-366d-4c1a-a78d-04018f274961\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" Apr 19 12:11:20.598949 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.598804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-stats-auth\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.598949 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.598860 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.598949 
ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:20.598884 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:11:20.598949 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:20.598949 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:21.098933425 +0000 UTC m=+99.817757559 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : secret "router-metrics-certs-default" not found Apr 19 12:11:20.599136 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.598967 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-default-certificate\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.599136 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.598992 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s42xm\" (UniqueName: \"kubernetes.io/projected/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-kube-api-access-s42xm\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.599136 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:20.599032 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle 
podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:21.099016269 +0000 UTC m=+99.817840435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : configmap references non-existent config key: service-ca.crt Apr 19 12:11:20.601247 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.601226 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-stats-auth\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.601247 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.601246 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-default-certificate\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.608363 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.608338 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42xm\" (UniqueName: \"kubernetes.io/projected/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-kube-api-access-s42xm\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:20.675541 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.675470 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4"] Apr 19 12:11:20.678680 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:11:20.678662 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:20.680876 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.680856 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:11:20.680876 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.680867 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 19 12:11:20.681027 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.680896 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 19 12:11:20.681027 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.680924 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-gbbd5\"" Apr 19 12:11:20.687188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.687164 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4"] Apr 19 12:11:20.700246 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.700227 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqbbx\" (UniqueName: \"kubernetes.io/projected/14272c98-366d-4c1a-a78d-04018f274961-kube-api-access-fqbbx\") pod \"volume-data-source-validator-7c6cbb6c87-pqspr\" (UID: \"14272c98-366d-4c1a-a78d-04018f274961\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" Apr 19 12:11:20.706986 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.706968 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fqbbx\" (UniqueName: \"kubernetes.io/projected/14272c98-366d-4c1a-a78d-04018f274961-kube-api-access-fqbbx\") pod \"volume-data-source-validator-7c6cbb6c87-pqspr\" (UID: \"14272c98-366d-4c1a-a78d-04018f274961\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" Apr 19 12:11:20.787905 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.787879 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" Apr 19 12:11:20.800892 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.800863 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:20.800980 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.800908 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpq6z\" (UniqueName: \"kubernetes.io/projected/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-kube-api-access-bpq6z\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:20.892164 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.892137 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr"] Apr 19 12:11:20.895457 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:20.895435 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14272c98_366d_4c1a_a78d_04018f274961.slice/crio-1f5bf52e4b18f0ca3cde4cfd25a08b1e19460dd0b127ada86434025a59f20b05 WatchSource:0}: Error finding container 1f5bf52e4b18f0ca3cde4cfd25a08b1e19460dd0b127ada86434025a59f20b05: Status 404 returned error can't find the container with id 1f5bf52e4b18f0ca3cde4cfd25a08b1e19460dd0b127ada86434025a59f20b05 Apr 19 12:11:20.901296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.901275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:20.901384 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.901306 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpq6z\" (UniqueName: \"kubernetes.io/projected/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-kube-api-access-bpq6z\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:20.901436 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:20.901412 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:11:20.901519 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:20.901509 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls podName:cc293f1b-2bae-4697-8e9f-839cf4b13f8b nodeName:}" failed. No retries permitted until 2026-04-19 12:11:21.401487486 +0000 UTC m=+100.120311619 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w6tr4" (UID: "cc293f1b-2bae-4697-8e9f-839cf4b13f8b") : secret "samples-operator-tls" not found Apr 19 12:11:20.913551 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:20.913533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpq6z\" (UniqueName: \"kubernetes.io/projected/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-kube-api-access-bpq6z\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:21.102369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:21.102331 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:21.102549 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:21.102421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:21.102549 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:21.102518 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. 
No retries permitted until 2026-04-19 12:11:22.102498555 +0000 UTC m=+100.821322693 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : configmap references non-existent config key: service-ca.crt Apr 19 12:11:21.102549 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:21.102520 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:11:21.102703 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:21.102558 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:22.102552165 +0000 UTC m=+100.821376297 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : secret "router-metrics-certs-default" not found Apr 19 12:11:21.317119 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:21.317079 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" event={"ID":"14272c98-366d-4c1a-a78d-04018f274961","Type":"ContainerStarted","Data":"1f5bf52e4b18f0ca3cde4cfd25a08b1e19460dd0b127ada86434025a59f20b05"} Apr 19 12:11:21.405240 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:21.405161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:21.405663 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:21.405279 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:11:21.405663 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:21.405332 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls podName:cc293f1b-2bae-4697-8e9f-839cf4b13f8b nodeName:}" failed. No retries permitted until 2026-04-19 12:11:22.405318737 +0000 UTC m=+101.124142871 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w6tr4" (UID: "cc293f1b-2bae-4697-8e9f-839cf4b13f8b") : secret "samples-operator-tls" not found Apr 19 12:11:22.109488 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:22.109441 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:22.109718 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:22.109543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:22.109718 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:22.109640 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:11:22.109843 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:22.109723 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:24.10970224 +0000 UTC m=+102.828526381 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : configmap references non-existent config key: service-ca.crt Apr 19 12:11:22.109843 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:22.109751 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:24.109739829 +0000 UTC m=+102.828563964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : secret "router-metrics-certs-default" not found Apr 19 12:11:22.266749 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:22.266726 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xmmjm" Apr 19 12:11:22.320019 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:22.319990 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" event={"ID":"14272c98-366d-4c1a-a78d-04018f274961","Type":"ContainerStarted","Data":"c01c3a627f7a356d9af9e7282f4285a4ce29de45f1adec7598c23db677204d46"} Apr 19 12:11:22.333117 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:22.333065 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pqspr" podStartSLOduration=0.965484265 podStartE2EDuration="2.333049942s" podCreationTimestamp="2026-04-19 12:11:20 +0000 UTC" firstStartedPulling="2026-04-19 12:11:20.897255657 +0000 UTC 
m=+99.616079790" lastFinishedPulling="2026-04-19 12:11:22.264821319 +0000 UTC m=+100.983645467" observedRunningTime="2026-04-19 12:11:22.332678117 +0000 UTC m=+101.051502271" watchObservedRunningTime="2026-04-19 12:11:22.333049942 +0000 UTC m=+101.051874076" Apr 19 12:11:22.412295 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:22.412228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:22.412691 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:22.412366 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:11:22.412691 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:22.412420 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls podName:cc293f1b-2bae-4697-8e9f-839cf4b13f8b nodeName:}" failed. No retries permitted until 2026-04-19 12:11:24.412405324 +0000 UTC m=+103.131229457 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w6tr4" (UID: "cc293f1b-2bae-4697-8e9f-839cf4b13f8b") : secret "samples-operator-tls" not found Apr 19 12:11:24.125613 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:24.125568 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:24.126152 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:24.125660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:24.126152 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:24.125726 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:11:24.126152 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:24.125784 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:28.12576994 +0000 UTC m=+106.844594076 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : secret "router-metrics-certs-default" not found Apr 19 12:11:24.126152 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:24.125832 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:28.125812693 +0000 UTC m=+106.844636844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : configmap references non-existent config key: service-ca.crt Apr 19 12:11:24.427672 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:24.427564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:24.427821 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:24.427716 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:11:24.427821 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:24.427778 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls podName:cc293f1b-2bae-4697-8e9f-839cf4b13f8b nodeName:}" failed. 
No retries permitted until 2026-04-19 12:11:28.427764337 +0000 UTC m=+107.146588469 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w6tr4" (UID: "cc293f1b-2bae-4697-8e9f-839cf4b13f8b") : secret "samples-operator-tls" not found Apr 19 12:11:26.159316 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:26.159288 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h8wxf_9920f4d8-e6b0-4993-baa5-e254915bebae/dns-node-resolver/0.log" Apr 19 12:11:26.959230 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:26.959206 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t4c4r_4e45a4de-472d-4b7e-addc-01dfec69c9d8/node-ca/0.log" Apr 19 12:11:27.409283 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.409214 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf"] Apr 19 12:11:27.412160 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.412145 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.414498 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.414477 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 19 12:11:27.414663 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.414594 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 19 12:11:27.414663 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.414602 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 19 12:11:27.414663 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.414604 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:11:27.414805 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.414752 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zl6n2\"" Apr 19 12:11:27.420707 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.420686 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf"] Apr 19 12:11:27.550835 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.550798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.551008 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.550842 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkdvq\" (UniqueName: \"kubernetes.io/projected/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-kube-api-access-lkdvq\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.551008 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.550911 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-config\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.651511 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.651468 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.651700 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.651523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkdvq\" (UniqueName: \"kubernetes.io/projected/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-kube-api-access-lkdvq\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.651700 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.651573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-config\") pod 
\"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.652079 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.652059 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-config\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.653615 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.653593 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.658998 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.658978 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkdvq\" (UniqueName: \"kubernetes.io/projected/a027b65b-1aeb-4b64-ba23-6687e4bcf69b-kube-api-access-lkdvq\") pod \"service-ca-operator-d6fc45fc5-6k6qf\" (UID: \"a027b65b-1aeb-4b64-ba23-6687e4bcf69b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.721193 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.721166 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" Apr 19 12:11:27.829329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:27.829308 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf"] Apr 19 12:11:27.831786 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:27.831757 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda027b65b_1aeb_4b64_ba23_6687e4bcf69b.slice/crio-6e4fba2426e02cda4fb101639e5b5b90958abb0b9e591e38918838e24df5529b WatchSource:0}: Error finding container 6e4fba2426e02cda4fb101639e5b5b90958abb0b9e591e38918838e24df5529b: Status 404 returned error can't find the container with id 6e4fba2426e02cda4fb101639e5b5b90958abb0b9e591e38918838e24df5529b Apr 19 12:11:28.155194 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.155125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:28.155194 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.155175 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:28.155370 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:28.155278 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:11:28.155370 ip-10-0-140-225 kubenswrapper[2568]: E0419 
12:11:28.155317 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:36.155298747 +0000 UTC m=+114.874122894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : configmap references non-existent config key: service-ca.crt Apr 19 12:11:28.155370 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:28.155335 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:36.155327658 +0000 UTC m=+114.874151793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : secret "router-metrics-certs-default" not found Apr 19 12:11:28.244383 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.244353 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p"] Apr 19 12:11:28.248443 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.248426 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" Apr 19 12:11:28.250404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.250385 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4569n\"" Apr 19 12:11:28.253379 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.253361 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p"] Apr 19 12:11:28.331617 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.331590 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" event={"ID":"a027b65b-1aeb-4b64-ba23-6687e4bcf69b","Type":"ContainerStarted","Data":"6e4fba2426e02cda4fb101639e5b5b90958abb0b9e591e38918838e24df5529b"} Apr 19 12:11:28.356965 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.356942 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfh5\" (UniqueName: \"kubernetes.io/projected/46d7fd4e-5950-4a4e-b4bb-712e0db4a633-kube-api-access-pbfh5\") pod \"network-check-source-8894fc9bd-8dx5p\" (UID: \"46d7fd4e-5950-4a4e-b4bb-712e0db4a633\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" Apr 19 12:11:28.457668 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.457641 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:28.457952 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.457683 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pbfh5\" (UniqueName: \"kubernetes.io/projected/46d7fd4e-5950-4a4e-b4bb-712e0db4a633-kube-api-access-pbfh5\") pod \"network-check-source-8894fc9bd-8dx5p\" (UID: \"46d7fd4e-5950-4a4e-b4bb-712e0db4a633\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" Apr 19 12:11:28.457952 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:28.457715 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:11:28.457952 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:28.457771 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls podName:cc293f1b-2bae-4697-8e9f-839cf4b13f8b nodeName:}" failed. No retries permitted until 2026-04-19 12:11:36.457754234 +0000 UTC m=+115.176578371 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w6tr4" (UID: "cc293f1b-2bae-4697-8e9f-839cf4b13f8b") : secret "samples-operator-tls" not found Apr 19 12:11:28.464962 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.464945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfh5\" (UniqueName: \"kubernetes.io/projected/46d7fd4e-5950-4a4e-b4bb-712e0db4a633-kube-api-access-pbfh5\") pod \"network-check-source-8894fc9bd-8dx5p\" (UID: \"46d7fd4e-5950-4a4e-b4bb-712e0db4a633\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" Apr 19 12:11:28.557652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.557616 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" Apr 19 12:11:28.683344 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:28.683314 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p"] Apr 19 12:11:28.685701 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:28.685671 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d7fd4e_5950_4a4e_b4bb_712e0db4a633.slice/crio-a3d9d2b1266d124fb246639356513ea740ee4b9ac03ca6dc050362c12fa2161d WatchSource:0}: Error finding container a3d9d2b1266d124fb246639356513ea740ee4b9ac03ca6dc050362c12fa2161d: Status 404 returned error can't find the container with id a3d9d2b1266d124fb246639356513ea740ee4b9ac03ca6dc050362c12fa2161d Apr 19 12:11:29.334659 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:29.334613 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" event={"ID":"46d7fd4e-5950-4a4e-b4bb-712e0db4a633","Type":"ContainerStarted","Data":"6b8e4aa67e6129bb0f01bc73c222b78fc0921c765a8b63b31fc467f30230a1ce"} Apr 19 12:11:29.334659 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:29.334658 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" event={"ID":"46d7fd4e-5950-4a4e-b4bb-712e0db4a633","Type":"ContainerStarted","Data":"a3d9d2b1266d124fb246639356513ea740ee4b9ac03ca6dc050362c12fa2161d"} Apr 19 12:11:29.347743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:29.347695 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8dx5p" podStartSLOduration=1.347677733 podStartE2EDuration="1.347677733s" podCreationTimestamp="2026-04-19 12:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:11:29.347270021 +0000 UTC m=+108.066094179" watchObservedRunningTime="2026-04-19 12:11:29.347677733 +0000 UTC m=+108.066501886" Apr 19 12:11:31.339716 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:31.339683 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" event={"ID":"a027b65b-1aeb-4b64-ba23-6687e4bcf69b","Type":"ContainerStarted","Data":"967c885d29b7460550c40eba1c9717cefcf72131b117eb7e9ddd6c106709dfb6"} Apr 19 12:11:31.353819 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:31.353778 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" podStartSLOduration=1.895890648 podStartE2EDuration="4.353762302s" podCreationTimestamp="2026-04-19 12:11:27 +0000 UTC" firstStartedPulling="2026-04-19 12:11:27.833885101 +0000 UTC m=+106.552709238" lastFinishedPulling="2026-04-19 12:11:30.291756745 +0000 UTC m=+109.010580892" observedRunningTime="2026-04-19 12:11:31.353180417 +0000 UTC m=+110.072004573" watchObservedRunningTime="2026-04-19 12:11:31.353762302 +0000 UTC m=+110.072586458" Apr 19 12:11:36.216611 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:36.216560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:36.217072 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:36.216650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: 
\"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:36.217072 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:36.216717 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 19 12:11:36.217072 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:36.216752 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:52.21673999 +0000 UTC m=+130.935564122 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : configmap references non-existent config key: service-ca.crt Apr 19 12:11:36.217072 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:36.216771 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs podName:62ce91ba-1d6b-4aae-9fc2-6ef69f90a963 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:52.216759453 +0000 UTC m=+130.935583586 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs") pod "router-default-57d6b784d-fjqwn" (UID: "62ce91ba-1d6b-4aae-9fc2-6ef69f90a963") : secret "router-metrics-certs-default" not found Apr 19 12:11:36.519608 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:36.519524 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:36.519776 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:36.519705 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 19 12:11:36.519776 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:36.519770 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls podName:cc293f1b-2bae-4697-8e9f-839cf4b13f8b nodeName:}" failed. No retries permitted until 2026-04-19 12:11:52.519756272 +0000 UTC m=+131.238580405 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-w6tr4" (UID: "cc293f1b-2bae-4697-8e9f-839cf4b13f8b") : secret "samples-operator-tls" not found Apr 19 12:11:50.625428 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:50.625376 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:11:50.627755 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:50.627731 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29686a24-b6da-4655-8af2-679ab3a6bbbf-metrics-certs\") pod \"network-metrics-daemon-7vkmz\" (UID: \"29686a24-b6da-4655-8af2-679ab3a6bbbf\") " pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:11:50.882722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:50.882644 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lsl4x\"" Apr 19 12:11:50.891425 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:50.891405 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vkmz" Apr 19 12:11:50.999102 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:50.999074 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7vkmz"] Apr 19 12:11:51.002198 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:51.002172 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29686a24_b6da_4655_8af2_679ab3a6bbbf.slice/crio-2be93114345f25a0878356464574766c14bd258bcc6fc920ec87e48e87973766 WatchSource:0}: Error finding container 2be93114345f25a0878356464574766c14bd258bcc6fc920ec87e48e87973766: Status 404 returned error can't find the container with id 2be93114345f25a0878356464574766c14bd258bcc6fc920ec87e48e87973766 Apr 19 12:11:51.379202 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:51.379161 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vkmz" event={"ID":"29686a24-b6da-4655-8af2-679ab3a6bbbf","Type":"ContainerStarted","Data":"2be93114345f25a0878356464574766c14bd258bcc6fc920ec87e48e87973766"} Apr 19 12:11:52.235722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.235691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:52.236099 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.235751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " 
pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:52.236396 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.236376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-service-ca-bundle\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:52.238216 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.238190 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62ce91ba-1d6b-4aae-9fc2-6ef69f90a963-metrics-certs\") pod \"router-default-57d6b784d-fjqwn\" (UID: \"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963\") " pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:52.493041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.492987 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-85dv2\"" Apr 19 12:11:52.500957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.500911 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:52.538439 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.538411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:52.541085 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.541062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cc293f1b-2bae-4697-8e9f-839cf4b13f8b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-w6tr4\" (UID: \"cc293f1b-2bae-4697-8e9f-839cf4b13f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:52.651686 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.651539 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-57d6b784d-fjqwn"] Apr 19 12:11:52.656749 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:52.656717 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62ce91ba_1d6b_4aae_9fc2_6ef69f90a963.slice/crio-93c781360260e7eaf016f44bdb80651003f5e073a669c97b2ab3094f5b8bb1cd WatchSource:0}: Error finding container 93c781360260e7eaf016f44bdb80651003f5e073a669c97b2ab3094f5b8bb1cd: Status 404 returned error can't find the container with id 93c781360260e7eaf016f44bdb80651003f5e073a669c97b2ab3094f5b8bb1cd Apr 19 12:11:52.789786 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.789714 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-gbbd5\"" Apr 19 12:11:52.798517 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.798497 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" Apr 19 12:11:52.906559 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:52.906528 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4"] Apr 19 12:11:53.385842 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.385808 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" event={"ID":"cc293f1b-2bae-4697-8e9f-839cf4b13f8b","Type":"ContainerStarted","Data":"15334a20eec982de80ca138ad0ce148cafbc8d91fbe1f41ff880d506f5dc8bc6"} Apr 19 12:11:53.387295 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.387265 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vkmz" event={"ID":"29686a24-b6da-4655-8af2-679ab3a6bbbf","Type":"ContainerStarted","Data":"ea510a8bab6fa9df76fda73afbd0c12c0746ff63bbc7f3e8826d54099f74a1d2"} Apr 19 12:11:53.387399 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.387301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vkmz" event={"ID":"29686a24-b6da-4655-8af2-679ab3a6bbbf","Type":"ContainerStarted","Data":"8ed7553862761d8999ed5e4f26c6570a20b513bffa5b509ce9e54842952f6776"} Apr 19 12:11:53.388469 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.388444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57d6b784d-fjqwn" event={"ID":"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963","Type":"ContainerStarted","Data":"307bd7c08da58662b7ba5b0bf57b2d34ce7553428593d107ca54b2ac8db6b5f1"} Apr 19 12:11:53.388540 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.388477 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-57d6b784d-fjqwn" event={"ID":"62ce91ba-1d6b-4aae-9fc2-6ef69f90a963","Type":"ContainerStarted","Data":"93c781360260e7eaf016f44bdb80651003f5e073a669c97b2ab3094f5b8bb1cd"} Apr 19 12:11:53.401664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.401603 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7vkmz" podStartSLOduration=130.948774812 podStartE2EDuration="2m12.401589969s" podCreationTimestamp="2026-04-19 12:09:41 +0000 UTC" firstStartedPulling="2026-04-19 12:11:51.004086576 +0000 UTC m=+129.722910723" lastFinishedPulling="2026-04-19 12:11:52.456901747 +0000 UTC m=+131.175725880" observedRunningTime="2026-04-19 12:11:53.401014631 +0000 UTC m=+132.119838791" watchObservedRunningTime="2026-04-19 12:11:53.401589969 +0000 UTC m=+132.120414124" Apr 19 12:11:53.416511 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.416433 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-57d6b784d-fjqwn" podStartSLOduration=33.416419147 podStartE2EDuration="33.416419147s" podCreationTimestamp="2026-04-19 12:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:11:53.416217614 +0000 UTC m=+132.135041770" watchObservedRunningTime="2026-04-19 12:11:53.416419147 +0000 UTC m=+132.135243304" Apr 19 12:11:53.501754 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.501720 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:53.504531 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:53.504505 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-57d6b784d-fjqwn" 
Apr 19 12:11:54.391228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.391200 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:54.392265 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.392247 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-57d6b784d-fjqwn" Apr 19 12:11:54.554539 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.554508 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tth9c"] Apr 19 12:11:54.558167 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.558146 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.560788 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.560747 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 19 12:11:54.560880 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.560796 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k9qgl\"" Apr 19 12:11:54.561036 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.561023 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 19 12:11:54.561273 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.561257 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 19 12:11:54.561371 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.561353 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 19 12:11:54.567886 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:11:54.567864 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tth9c"] Apr 19 12:11:54.655041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.654952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/495fbf4b-c9eb-42e3-8537-ae405b10f613-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.655041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.654989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/495fbf4b-c9eb-42e3-8537-ae405b10f613-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.655041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.655012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kd4r\" (UniqueName: \"kubernetes.io/projected/495fbf4b-c9eb-42e3-8537-ae405b10f613-kube-api-access-8kd4r\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.655266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.655087 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/495fbf4b-c9eb-42e3-8537-ae405b10f613-crio-socket\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.655266 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:11:54.655131 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/495fbf4b-c9eb-42e3-8537-ae405b10f613-data-volume\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.755487 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.755456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/495fbf4b-c9eb-42e3-8537-ae405b10f613-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.755487 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.755491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kd4r\" (UniqueName: \"kubernetes.io/projected/495fbf4b-c9eb-42e3-8537-ae405b10f613-kube-api-access-8kd4r\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.755752 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.755509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/495fbf4b-c9eb-42e3-8537-ae405b10f613-crio-socket\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.755752 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.755527 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/495fbf4b-c9eb-42e3-8537-ae405b10f613-data-volume\") pod 
\"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.755752 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.755608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/495fbf4b-c9eb-42e3-8537-ae405b10f613-crio-socket\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.755877 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.755814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/495fbf4b-c9eb-42e3-8537-ae405b10f613-data-volume\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.755877 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.755839 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/495fbf4b-c9eb-42e3-8537-ae405b10f613-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.756307 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.756284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/495fbf4b-c9eb-42e3-8537-ae405b10f613-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.757945 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.757920 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/495fbf4b-c9eb-42e3-8537-ae405b10f613-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.766407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.766376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kd4r\" (UniqueName: \"kubernetes.io/projected/495fbf4b-c9eb-42e3-8537-ae405b10f613-kube-api-access-8kd4r\") pod \"insights-runtime-extractor-tth9c\" (UID: \"495fbf4b-c9eb-42e3-8537-ae405b10f613\") " pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:54.867407 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:54.867370 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tth9c" Apr 19 12:11:55.294502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.294422 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tth9c"] Apr 19 12:11:55.296768 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:55.296741 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495fbf4b_c9eb_42e3_8537_ae405b10f613.slice/crio-70480e5809ff7751edf445f5ad399fc0a1fcd2f988f82cb285b6f9fa87878539 WatchSource:0}: Error finding container 70480e5809ff7751edf445f5ad399fc0a1fcd2f988f82cb285b6f9fa87878539: Status 404 returned error can't find the container with id 70480e5809ff7751edf445f5ad399fc0a1fcd2f988f82cb285b6f9fa87878539 Apr 19 12:11:55.395056 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.395012 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" 
event={"ID":"cc293f1b-2bae-4697-8e9f-839cf4b13f8b","Type":"ContainerStarted","Data":"0c40ece9c3c23404032eae0cc28ca6605604dca880eb5eadc6983bfec0993901"} Apr 19 12:11:55.395056 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.395045 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" event={"ID":"cc293f1b-2bae-4697-8e9f-839cf4b13f8b","Type":"ContainerStarted","Data":"8d0e122812099c534f0fdf916709f4fa9b85af629bed6478b8521d602fc4a8d9"} Apr 19 12:11:55.396567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.396546 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tth9c" event={"ID":"495fbf4b-c9eb-42e3-8537-ae405b10f613","Type":"ContainerStarted","Data":"04ab562bdd63d78fe47e10d3816e7d519743262f9bc9bef21276e191bd10f7aa"} Apr 19 12:11:55.396653 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.396573 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tth9c" event={"ID":"495fbf4b-c9eb-42e3-8537-ae405b10f613","Type":"ContainerStarted","Data":"70480e5809ff7751edf445f5ad399fc0a1fcd2f988f82cb285b6f9fa87878539"} Apr 19 12:11:55.410721 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.410682 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-w6tr4" podStartSLOduration=33.127059529 podStartE2EDuration="35.410667373s" podCreationTimestamp="2026-04-19 12:11:20 +0000 UTC" firstStartedPulling="2026-04-19 12:11:52.948740654 +0000 UTC m=+131.667564798" lastFinishedPulling="2026-04-19 12:11:55.232348504 +0000 UTC m=+133.951172642" observedRunningTime="2026-04-19 12:11:55.410314657 +0000 UTC m=+134.129138813" watchObservedRunningTime="2026-04-19 12:11:55.410667373 +0000 UTC m=+134.129491528" Apr 19 12:11:55.548928 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.548857 2568 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd"] Apr 19 12:11:55.551921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.551903 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" Apr 19 12:11:55.556131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.556110 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 19 12:11:55.556131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.556121 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-x6jn2\"" Apr 19 12:11:55.561183 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.561159 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd"] Apr 19 12:11:55.665099 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.665056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5349ed6d-296f-49e9-8001-1cb71a7a2a71-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9rcpd\" (UID: \"5349ed6d-296f-49e9-8001-1cb71a7a2a71\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" Apr 19 12:11:55.766496 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.766451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5349ed6d-296f-49e9-8001-1cb71a7a2a71-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9rcpd\" (UID: \"5349ed6d-296f-49e9-8001-1cb71a7a2a71\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" Apr 19 12:11:55.769393 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.769371 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/5349ed6d-296f-49e9-8001-1cb71a7a2a71-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9rcpd\" (UID: \"5349ed6d-296f-49e9-8001-1cb71a7a2a71\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" Apr 19 12:11:55.864428 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.864401 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" Apr 19 12:11:55.976899 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:55.976834 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd"] Apr 19 12:11:55.980789 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:55.980762 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5349ed6d_296f_49e9_8001_1cb71a7a2a71.slice/crio-08926ba11823244986f6e44547cd93a6d0a389ee2237ad28d332731e28c1d39d WatchSource:0}: Error finding container 08926ba11823244986f6e44547cd93a6d0a389ee2237ad28d332731e28c1d39d: Status 404 returned error can't find the container with id 08926ba11823244986f6e44547cd93a6d0a389ee2237ad28d332731e28c1d39d Apr 19 12:11:56.401046 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:56.401012 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tth9c" event={"ID":"495fbf4b-c9eb-42e3-8537-ae405b10f613","Type":"ContainerStarted","Data":"91ef3f942b069605905e2d83a54950b5ecbfa1dc0ce4a807a75e83cec11ebfd4"} Apr 19 12:11:56.402163 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:56.402135 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" event={"ID":"5349ed6d-296f-49e9-8001-1cb71a7a2a71","Type":"ContainerStarted","Data":"08926ba11823244986f6e44547cd93a6d0a389ee2237ad28d332731e28c1d39d"}
Apr 19 12:11:57.406338 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.406297 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" event={"ID":"5349ed6d-296f-49e9-8001-1cb71a7a2a71","Type":"ContainerStarted","Data":"e775bb50c38ab132d095a7b35335f2782159d853c7eb85f798246f5893f9b74e"}
Apr 19 12:11:57.406793 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.406513 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd"
Apr 19 12:11:57.411808 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.411780 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd"
Apr 19 12:11:57.421546 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.421454 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9rcpd" podStartSLOduration=1.233645946 podStartE2EDuration="2.421409277s" podCreationTimestamp="2026-04-19 12:11:55 +0000 UTC" firstStartedPulling="2026-04-19 12:11:55.982581861 +0000 UTC m=+134.701405997" lastFinishedPulling="2026-04-19 12:11:57.170345196 +0000 UTC m=+135.889169328" observedRunningTime="2026-04-19 12:11:57.420703267 +0000 UTC m=+136.139527425" watchObservedRunningTime="2026-04-19 12:11:57.421409277 +0000 UTC m=+136.140233426"
Apr 19 12:11:57.601387 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.601351 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kckkr"]
Apr 19 12:11:57.604488 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.604474 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.606799 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.606773 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 19 12:11:57.606799 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.606788 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 19 12:11:57.606984 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.606793 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 19 12:11:57.606984 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.606816 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 19 12:11:57.607057 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.606998 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-cfzqm\""
Apr 19 12:11:57.607156 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.607139 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 19 12:11:57.611156 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.611121 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kckkr"]
Apr 19 12:11:57.680560 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.680482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.680724 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.680564 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d391a91d-cb33-4e29-a391-e8fd3f91c810-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.680724 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.680673 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.680724 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.680700 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p989j\" (UniqueName: \"kubernetes.io/projected/d391a91d-cb33-4e29-a391-e8fd3f91c810-kube-api-access-p989j\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.781648 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.781607 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.781800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.781712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d391a91d-cb33-4e29-a391-e8fd3f91c810-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.781800 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:57.781750 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 19 12:11:57.781800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.781793 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.781971 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:11:57.781820 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-tls podName:d391a91d-cb33-4e29-a391-e8fd3f91c810 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:58.281798779 +0000 UTC m=+137.000622916 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-kckkr" (UID: "d391a91d-cb33-4e29-a391-e8fd3f91c810") : secret "prometheus-operator-tls" not found
Apr 19 12:11:57.781971 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.781858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p989j\" (UniqueName: \"kubernetes.io/projected/d391a91d-cb33-4e29-a391-e8fd3f91c810-kube-api-access-p989j\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.782753 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.782732 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d391a91d-cb33-4e29-a391-e8fd3f91c810-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.784470 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.784449 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:57.790059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:57.790040 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p989j\" (UniqueName: \"kubernetes.io/projected/d391a91d-cb33-4e29-a391-e8fd3f91c810-kube-api-access-p989j\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:58.286464 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:58.286434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:58.288635 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:58.288603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d391a91d-cb33-4e29-a391-e8fd3f91c810-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-kckkr\" (UID: \"d391a91d-cb33-4e29-a391-e8fd3f91c810\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:58.415324 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:58.415296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tth9c" event={"ID":"495fbf4b-c9eb-42e3-8537-ae405b10f613","Type":"ContainerStarted","Data":"98c667e1650dacc9a623b7699abacb43b59638c418ca9e504cd4b44dcb6b6233"}
Apr 19 12:11:58.433931 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:58.433893 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tth9c" podStartSLOduration=1.582658804 podStartE2EDuration="4.433878978s" podCreationTimestamp="2026-04-19 12:11:54 +0000 UTC" firstStartedPulling="2026-04-19 12:11:55.352291642 +0000 UTC m=+134.071115789" lastFinishedPulling="2026-04-19 12:11:58.203511827 +0000 UTC m=+136.922335963" observedRunningTime="2026-04-19 12:11:58.432734875 +0000 UTC m=+137.151559030" watchObservedRunningTime="2026-04-19 12:11:58.433878978 +0000 UTC m=+137.152703135"
Apr 19 12:11:58.514981 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:58.514913 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr"
Apr 19 12:11:58.623657 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:58.623602 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-kckkr"]
Apr 19 12:11:58.627423 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:11:58.627400 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd391a91d_cb33_4e29_a391_e8fd3f91c810.slice/crio-706915029b09faadaf679839c9273b8e3bc1f2929bb47f011675ad796197db16 WatchSource:0}: Error finding container 706915029b09faadaf679839c9273b8e3bc1f2929bb47f011675ad796197db16: Status 404 returned error can't find the container with id 706915029b09faadaf679839c9273b8e3bc1f2929bb47f011675ad796197db16
Apr 19 12:11:59.419401 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:11:59.419360 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr" event={"ID":"d391a91d-cb33-4e29-a391-e8fd3f91c810","Type":"ContainerStarted","Data":"706915029b09faadaf679839c9273b8e3bc1f2929bb47f011675ad796197db16"}
Apr 19 12:12:00.423606 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:00.423521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr" event={"ID":"d391a91d-cb33-4e29-a391-e8fd3f91c810","Type":"ContainerStarted","Data":"17836ab4376c861c9fe61ac1968e3859ec51e948868194aae5e514c019114b5a"}
Apr 19 12:12:00.423606 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:00.423559 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr" event={"ID":"d391a91d-cb33-4e29-a391-e8fd3f91c810","Type":"ContainerStarted","Data":"6424cddf6af43a09e6378a66fe8da9df1d5aff755028649df0dd5225feb8dbf4"}
Apr 19 12:12:00.439054 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:00.439010 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-kckkr" podStartSLOduration=2.032259831 podStartE2EDuration="3.438997204s" podCreationTimestamp="2026-04-19 12:11:57 +0000 UTC" firstStartedPulling="2026-04-19 12:11:58.629726882 +0000 UTC m=+137.348551028" lastFinishedPulling="2026-04-19 12:12:00.03646425 +0000 UTC m=+138.755288401" observedRunningTime="2026-04-19 12:12:00.437283502 +0000 UTC m=+139.156107660" watchObservedRunningTime="2026-04-19 12:12:00.438997204 +0000 UTC m=+139.157821379"
Apr 19 12:12:01.927994 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.927962 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"]
Apr 19 12:12:01.931237 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.931222 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:01.933454 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.933428 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 19 12:12:01.933583 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.933458 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 19 12:12:01.933583 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.933507 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-ts4mz\""
Apr 19 12:12:01.940311 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.940290 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"]
Apr 19 12:12:01.970497 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.970475 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-p8gp6"]
Apr 19 12:12:01.973354 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.973341 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:01.975408 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.975391 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lgltw\""
Apr 19 12:12:01.975532 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.975487 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 19 12:12:01.975601 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.975502 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 19 12:12:01.976064 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:01.976043 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 19 12:12:02.015487 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015457 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.015585 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015491 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m92v9\" (UniqueName: \"kubernetes.io/projected/bea51828-5053-4026-a29e-73d8cc734dcd-kube-api-access-m92v9\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.015653 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015594 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.015653 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015620 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xb4p\" (UniqueName: \"kubernetes.io/projected/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-kube-api-access-4xb4p\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.015766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015655 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-tls\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.015766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015677 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.015766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bea51828-5053-4026-a29e-73d8cc734dcd-metrics-client-ca\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.015766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015764 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.015903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-root\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.015903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-textfile\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.015903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015859 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-sys\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.016013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.016013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.015933 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-wtmp\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.116923 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.116891 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.116923 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.116926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xb4p\" (UniqueName: \"kubernetes.io/projected/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-kube-api-access-4xb4p\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.117167 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.116945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-tls\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117167 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.117167 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bea51828-5053-4026-a29e-73d8cc734dcd-metrics-client-ca\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117168 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117198 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-root\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-textfile\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117260 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-sys\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-root\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.117568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117346 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-sys\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-wtmp\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.117568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117431 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m92v9\" (UniqueName: \"kubernetes.io/projected/bea51828-5053-4026-a29e-73d8cc734dcd-kube-api-access-m92v9\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117467 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-wtmp\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-textfile\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.117864 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:12:02.117662 2568 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 19 12:12:02.117864 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.117681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.117864 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:12:02.117726 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-tls podName:5725e8df-66f1-4e22-bfed-e6467ea2c2a6 nodeName:}" failed. No retries permitted until 2026-04-19 12:12:02.61770843 +0000 UTC m=+141.336532573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-tq6hj" (UID: "5725e8df-66f1-4e22-bfed-e6467ea2c2a6") : secret "openshift-state-metrics-tls" not found
Apr 19 12:12:02.118066 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.118046 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.118156 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.118135 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bea51828-5053-4026-a29e-73d8cc734dcd-metrics-client-ca\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.119688 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.119663 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.119791 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.119702 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.119791 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.119733 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bea51828-5053-4026-a29e-73d8cc734dcd-node-exporter-tls\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.126646 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.126606 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92v9\" (UniqueName: \"kubernetes.io/projected/bea51828-5053-4026-a29e-73d8cc734dcd-kube-api-access-m92v9\") pod \"node-exporter-p8gp6\" (UID: \"bea51828-5053-4026-a29e-73d8cc734dcd\") " pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.126738 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.126671 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xb4p\" (UniqueName: \"kubernetes.io/projected/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-kube-api-access-4xb4p\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.285289 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.285255 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p8gp6"
Apr 19 12:12:02.294605 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:02.294575 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea51828_5053_4026_a29e_73d8cc734dcd.slice/crio-a959496f014eb71afb240468f37db5602a099d551c4d2912f6d7138e8308302c WatchSource:0}: Error finding container a959496f014eb71afb240468f37db5602a099d551c4d2912f6d7138e8308302c: Status 404 returned error can't find the container with id a959496f014eb71afb240468f37db5602a099d551c4d2912f6d7138e8308302c
Apr 19 12:12:02.430208 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.430175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8gp6" event={"ID":"bea51828-5053-4026-a29e-73d8cc734dcd","Type":"ContainerStarted","Data":"a959496f014eb71afb240468f37db5602a099d551c4d2912f6d7138e8308302c"}
Apr 19 12:12:02.620646 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.620528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.622920 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.622898 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5725e8df-66f1-4e22-bfed-e6467ea2c2a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-tq6hj\" (UID: \"5725e8df-66f1-4e22-bfed-e6467ea2c2a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.839958 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.839929 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"
Apr 19 12:12:02.969371 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:02.969204 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj"]
Apr 19 12:12:02.972207 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:02.972170 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5725e8df_66f1_4e22_bfed_e6467ea2c2a6.slice/crio-cf9224359afce5c3a0a617e13ec21d4fd88777ae458d5a616e0ac3cea349de0b WatchSource:0}: Error finding container cf9224359afce5c3a0a617e13ec21d4fd88777ae458d5a616e0ac3cea349de0b: Status 404 returned error can't find the container with id cf9224359afce5c3a0a617e13ec21d4fd88777ae458d5a616e0ac3cea349de0b
Apr 19 12:12:03.009847 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.009727 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:12:03.015840 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.015363 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:12:03.018296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.017908 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 19 12:12:03.018296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.017931 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 19 12:12:03.018296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018005 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 19 12:12:03.018296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018155 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 19 12:12:03.018296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018185 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 19 12:12:03.018296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018196 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 19 12:12:03.018687 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018599 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 19 12:12:03.018742 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018703 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 19 12:12:03.019012 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018865 2568 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4k858\"" Apr 19 12:12:03.019012 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.018907 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 19 12:12:03.025657 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.025252 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:12:03.125523 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125499 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9sm9\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-kube-api-access-p9sm9\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.125650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.125706 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125655 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-config-volume\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.125706 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125681 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.125780 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125754 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.125815 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125787 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.125847 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-web-config\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.125898 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.125856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.126021 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.126002 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.126159 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.126050 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.126159 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.126090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-config-out\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.126159 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.126116 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.126159 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.126156 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227302 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-config-volume\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227340 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227392 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-web-config\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227452 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 
12:12:03.227891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-config-out\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.227891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.227606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9sm9\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-kube-api-access-p9sm9\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.228515 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.228164 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.230339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.230311 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-config-volume\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.230496 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.230475 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-config-out\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.230595 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.230573 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.230688 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.230608 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.230950 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.230930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-web-config\") pod \"alertmanager-main-0\" 
(UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.231113 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.231089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.231181 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.231162 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.231404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.231383 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.231501 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.231469 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.231668 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.231651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.232756 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.232736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.235806 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.235789 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9sm9\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-kube-api-access-p9sm9\") pod \"alertmanager-main-0\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.340544 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.340523 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:12:03.433691 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.433661 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8gp6" event={"ID":"bea51828-5053-4026-a29e-73d8cc734dcd","Type":"ContainerStarted","Data":"d52705c421252c906a358992412a844efa6923af8926d48e97d28da2f3af755d"} Apr 19 12:12:03.440866 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.440830 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj" event={"ID":"5725e8df-66f1-4e22-bfed-e6467ea2c2a6","Type":"ContainerStarted","Data":"9cda2dfec9cec237cac66850f049217b8325c9165cb1164c92b204234c6ecff8"} Apr 19 12:12:03.440866 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.440869 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj" event={"ID":"5725e8df-66f1-4e22-bfed-e6467ea2c2a6","Type":"ContainerStarted","Data":"90e79734d0105b0674a4a6a43b9a66d341b7fd78a2c9cc7657cdce1f59e7d309"} Apr 19 12:12:03.441030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.440878 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj" event={"ID":"5725e8df-66f1-4e22-bfed-e6467ea2c2a6","Type":"ContainerStarted","Data":"cf9224359afce5c3a0a617e13ec21d4fd88777ae458d5a616e0ac3cea349de0b"} Apr 19 12:12:03.464512 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:03.464488 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:12:03.466844 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:03.466821 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2841affb_5791_4e66_bbba_2b764530ccc4.slice/crio-3f25ac19e31229d5bed764ee30d2cad63e92dcad07c46013448c80290ea73a79 
WatchSource:0}: Error finding container 3f25ac19e31229d5bed764ee30d2cad63e92dcad07c46013448c80290ea73a79: Status 404 returned error can't find the container with id 3f25ac19e31229d5bed764ee30d2cad63e92dcad07c46013448c80290ea73a79 Apr 19 12:12:04.444872 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:04.444840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj" event={"ID":"5725e8df-66f1-4e22-bfed-e6467ea2c2a6","Type":"ContainerStarted","Data":"6ab923f1f23d0926c05c4f90028b87b352c68834d8db901c6c6ae6a856d93ff2"} Apr 19 12:12:04.446435 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:04.446407 2568 generic.go:358] "Generic (PLEG): container finished" podID="bea51828-5053-4026-a29e-73d8cc734dcd" containerID="d52705c421252c906a358992412a844efa6923af8926d48e97d28da2f3af755d" exitCode=0 Apr 19 12:12:04.446569 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:04.446499 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8gp6" event={"ID":"bea51828-5053-4026-a29e-73d8cc734dcd","Type":"ContainerDied","Data":"d52705c421252c906a358992412a844efa6923af8926d48e97d28da2f3af755d"} Apr 19 12:12:04.447801 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:04.447764 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerStarted","Data":"3f25ac19e31229d5bed764ee30d2cad63e92dcad07c46013448c80290ea73a79"} Apr 19 12:12:04.463659 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:04.463600 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-tq6hj" podStartSLOduration=2.500706902 podStartE2EDuration="3.463584277s" podCreationTimestamp="2026-04-19 12:12:01 +0000 UTC" firstStartedPulling="2026-04-19 12:12:03.113868822 +0000 UTC m=+141.832692955" lastFinishedPulling="2026-04-19 
12:12:04.076746183 +0000 UTC m=+142.795570330" observedRunningTime="2026-04-19 12:12:04.46143923 +0000 UTC m=+143.180263410" watchObservedRunningTime="2026-04-19 12:12:04.463584277 +0000 UTC m=+143.182408433" Apr 19 12:12:05.451914 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:05.451880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8gp6" event={"ID":"bea51828-5053-4026-a29e-73d8cc734dcd","Type":"ContainerStarted","Data":"a475756a6fefaf3dd4daecfbfb978563008742ec59595c04e5f0cb46d4b66e85"} Apr 19 12:12:05.452285 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:05.451920 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8gp6" event={"ID":"bea51828-5053-4026-a29e-73d8cc734dcd","Type":"ContainerStarted","Data":"e224713c38cef21d4fe0675eec440040f61d2efc667fdb7ded75d9526c22b4b7"} Apr 19 12:12:05.453169 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:05.453144 2568 generic.go:358] "Generic (PLEG): container finished" podID="2841affb-5791-4e66-bbba-2b764530ccc4" containerID="2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5" exitCode=0 Apr 19 12:12:05.453294 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:05.453226 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5"} Apr 19 12:12:05.469032 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:05.468990 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-p8gp6" podStartSLOduration=3.493483312 podStartE2EDuration="4.468977161s" podCreationTimestamp="2026-04-19 12:12:01 +0000 UTC" firstStartedPulling="2026-04-19 12:12:02.296260282 +0000 UTC m=+141.015084430" lastFinishedPulling="2026-04-19 12:12:03.271754143 +0000 UTC m=+141.990578279" 
observedRunningTime="2026-04-19 12:12:05.467820359 +0000 UTC m=+144.186644514" watchObservedRunningTime="2026-04-19 12:12:05.468977161 +0000 UTC m=+144.187801316" Apr 19 12:12:07.154861 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.154836 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-557f7b95bf-wpn6m"] Apr 19 12:12:07.159252 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.159231 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.161743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.161570 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 19 12:12:07.161743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.161598 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 19 12:12:07.161743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.161708 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-gnjjt\"" Apr 19 12:12:07.161743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.161740 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 19 12:12:07.161993 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.161973 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 19 12:12:07.162050 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.161991 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 19 12:12:07.166611 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.166587 2568 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 19 12:12:07.170641 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.170560 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-557f7b95bf-wpn6m"] Apr 19 12:12:07.266670 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266612 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-metrics-client-ca\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.266768 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266685 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.266768 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266744 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-serving-certs-ca-bundle\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.266854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266811 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.266854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266838 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-federate-client-tls\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.266946 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-secret-telemeter-client\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.266946 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266910 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gddtr\" (UniqueName: \"kubernetes.io/projected/fb0976d2-6114-4151-ad73-08b849a869aa-kube-api-access-gddtr\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.267010 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.266949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-telemeter-client-tls\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: 
\"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.367738 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.367700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-metrics-client-ca\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.367738 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.367736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.367904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.367776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-serving-certs-ca-bundle\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.367904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.367813 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.367904 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:12:07.367831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-federate-client-tls\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.367904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.367853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-secret-telemeter-client\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.367904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.367882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gddtr\" (UniqueName: \"kubernetes.io/projected/fb0976d2-6114-4151-ad73-08b849a869aa-kube-api-access-gddtr\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.368232 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.367914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-telemeter-client-tls\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.368564 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.368440 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-metrics-client-ca\") pod 
\"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.368564 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.368545 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.369006 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.368972 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0976d2-6114-4151-ad73-08b849a869aa-serving-certs-ca-bundle\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.370892 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.370865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-telemeter-client-tls\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.370997 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.370913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-federate-client-tls\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.370997 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.370930 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-secret-telemeter-client\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.371105 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.371005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fb0976d2-6114-4151-ad73-08b849a869aa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.375192 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.375172 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gddtr\" (UniqueName: \"kubernetes.io/projected/fb0976d2-6114-4151-ad73-08b849a869aa-kube-api-access-gddtr\") pod \"telemeter-client-557f7b95bf-wpn6m\" (UID: \"fb0976d2-6114-4151-ad73-08b849a869aa\") " pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.462139 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.462096 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerStarted","Data":"f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b"} Apr 19 12:12:07.462139 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.462139 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerStarted","Data":"f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0"} Apr 19 12:12:07.462319 ip-10-0-140-225 kubenswrapper[2568]: 
I0419 12:12:07.462151 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerStarted","Data":"7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5"} Apr 19 12:12:07.462319 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.462162 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerStarted","Data":"ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b"} Apr 19 12:12:07.462319 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.462172 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerStarted","Data":"b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2"} Apr 19 12:12:07.481937 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.481912 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" Apr 19 12:12:07.611480 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:07.611456 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-557f7b95bf-wpn6m"] Apr 19 12:12:07.614314 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:07.614288 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb0976d2_6114_4151_ad73_08b849a869aa.slice/crio-8d14a9cb3e5ec1e634b078f556fc8eccd499d380f9fba3f5612082ed33831fa8 WatchSource:0}: Error finding container 8d14a9cb3e5ec1e634b078f556fc8eccd499d380f9fba3f5612082ed33831fa8: Status 404 returned error can't find the container with id 8d14a9cb3e5ec1e634b078f556fc8eccd499d380f9fba3f5612082ed33831fa8 Apr 19 12:12:08.469951 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:08.469895 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerStarted","Data":"8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d"} Apr 19 12:12:08.471251 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:08.471218 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" event={"ID":"fb0976d2-6114-4151-ad73-08b849a869aa","Type":"ContainerStarted","Data":"8d14a9cb3e5ec1e634b078f556fc8eccd499d380f9fba3f5612082ed33831fa8"} Apr 19 12:12:08.507650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:08.500520 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.008526123 podStartE2EDuration="6.50050353s" podCreationTimestamp="2026-04-19 12:12:02 +0000 UTC" firstStartedPulling="2026-04-19 12:12:03.468532068 +0000 UTC m=+142.187356202" lastFinishedPulling="2026-04-19 12:12:07.960509462 +0000 UTC 
m=+146.679333609" observedRunningTime="2026-04-19 12:12:08.496368442 +0000 UTC m=+147.215192600" watchObservedRunningTime="2026-04-19 12:12:08.50050353 +0000 UTC m=+147.219327687" Apr 19 12:12:09.476345 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:09.476312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" event={"ID":"fb0976d2-6114-4151-ad73-08b849a869aa","Type":"ContainerStarted","Data":"9f645edeb664c02d5ffcff04bb1508086b6c380574f9c7bc07ca7bc4a3b96635"} Apr 19 12:12:10.480678 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:10.480640 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" event={"ID":"fb0976d2-6114-4151-ad73-08b849a869aa","Type":"ContainerStarted","Data":"398bc4d61f93820b7f87f4e6741a73c28d719fe6e3ed221a382da249d0e6d7a4"} Apr 19 12:12:10.480678 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:10.480674 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" event={"ID":"fb0976d2-6114-4151-ad73-08b849a869aa","Type":"ContainerStarted","Data":"d1e3d2b4eb2f0941bf3aea7e0b3af9bd3c346b7d27b1c0456bb5331d6239f4d4"} Apr 19 12:12:10.500369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:10.500282 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-557f7b95bf-wpn6m" podStartSLOduration=1.7275163359999999 podStartE2EDuration="3.500269633s" podCreationTimestamp="2026-04-19 12:12:07 +0000 UTC" firstStartedPulling="2026-04-19 12:12:07.616254694 +0000 UTC m=+146.335078830" lastFinishedPulling="2026-04-19 12:12:09.38900798 +0000 UTC m=+148.107832127" observedRunningTime="2026-04-19 12:12:10.499744991 +0000 UTC m=+149.218569146" watchObservedRunningTime="2026-04-19 12:12:10.500269633 +0000 UTC m=+149.219093788" Apr 19 12:12:13.282271 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.282233 2568 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-console/console-64d9969655-xjn7b"] Apr 19 12:12:13.286542 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.286521 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.288929 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.288910 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 19 12:12:13.289791 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.289772 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jfhvt\"" Apr 19 12:12:13.289885 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.289776 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 19 12:12:13.289885 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.289805 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 19 12:12:13.289885 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.289777 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 19 12:12:13.290178 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.290166 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 19 12:12:13.290220 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.290176 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 19 12:12:13.290220 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.290194 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 19 12:12:13.293745 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:12:13.293727 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d9969655-xjn7b"] Apr 19 12:12:13.423712 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.423674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-oauth-serving-cert\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.424100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.423790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-oauth-config\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.424100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.423843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-service-ca\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.424100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.423919 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-serving-cert\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.424100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.424003 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpxmj\" (UniqueName: \"kubernetes.io/projected/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-kube-api-access-rpxmj\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.424100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.424027 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-config\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.525444 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.525407 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-oauth-config\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.525444 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.525444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-service-ca\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.525690 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.525477 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-serving-cert\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 
12:12:13.525690 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.525656 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpxmj\" (UniqueName: \"kubernetes.io/projected/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-kube-api-access-rpxmj\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.525789 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.525699 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-config\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.525942 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.525920 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-oauth-serving-cert\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.526321 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.526302 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-service-ca\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.526438 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.526418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-oauth-serving-cert\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " 
pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.526803 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.526787 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-config\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.528373 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.528352 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-oauth-config\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.528542 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.528521 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-serving-cert\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.533083 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.533041 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpxmj\" (UniqueName: \"kubernetes.io/projected/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-kube-api-access-rpxmj\") pod \"console-64d9969655-xjn7b\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.596991 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.596959 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:13.709745 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:13.709703 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d9969655-xjn7b"] Apr 19 12:12:13.712262 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:13.712229 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16abf6b0_046b_40c3_9ccc_ca5fdd70e81d.slice/crio-ecaca63b4772838ced1b0597cc295765ab327739c3a7c336ab299fe8b623180a WatchSource:0}: Error finding container ecaca63b4772838ced1b0597cc295765ab327739c3a7c336ab299fe8b623180a: Status 404 returned error can't find the container with id ecaca63b4772838ced1b0597cc295765ab327739c3a7c336ab299fe8b623180a Apr 19 12:12:14.494957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:14.494924 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d9969655-xjn7b" event={"ID":"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d","Type":"ContainerStarted","Data":"ecaca63b4772838ced1b0597cc295765ab327739c3a7c336ab299fe8b623180a"} Apr 19 12:12:17.504612 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:17.504577 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d9969655-xjn7b" event={"ID":"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d","Type":"ContainerStarted","Data":"ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5"} Apr 19 12:12:17.522047 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:17.522006 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d9969655-xjn7b" podStartSLOduration=1.388903475 podStartE2EDuration="4.521991569s" podCreationTimestamp="2026-04-19 12:12:13 +0000 UTC" firstStartedPulling="2026-04-19 12:12:13.714197302 +0000 UTC m=+152.433021435" lastFinishedPulling="2026-04-19 12:12:16.847285389 +0000 UTC m=+155.566109529" 
observedRunningTime="2026-04-19 12:12:17.519485681 +0000 UTC m=+156.238309836" watchObservedRunningTime="2026-04-19 12:12:17.521991569 +0000 UTC m=+156.240815723" Apr 19 12:12:17.637000 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:12:17.636961 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" podUID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" Apr 19 12:12:17.645070 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:12:17.645041 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9xxkb" podUID="15abc5b3-a4e0-41a2-b57d-ee187b37cd52" Apr 19 12:12:17.758216 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:12:17.758141 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kzhlq" podUID="9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2" Apr 19 12:12:18.507600 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:18.507517 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:12:18.507600 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:18.507557 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:12:18.508037 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:18.507612 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kzhlq" Apr 19 12:12:22.510853 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.510823 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:12:22.511321 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.510867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:12:22.513140 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.513114 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"image-registry-5f5598bc7f-hrzcz\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:12:22.513239 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.513149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15abc5b3-a4e0-41a2-b57d-ee187b37cd52-cert\") pod \"ingress-canary-9xxkb\" (UID: \"15abc5b3-a4e0-41a2-b57d-ee187b37cd52\") " pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:12:22.612100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.612074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: 
\"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq" Apr 19 12:12:22.614101 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.614075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2-metrics-tls\") pod \"dns-default-kzhlq\" (UID: \"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2\") " pod="openshift-dns/dns-default-kzhlq" Apr 19 12:12:22.711688 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.711621 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rh992\"" Apr 19 12:12:22.711688 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.711621 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zfmpv\"" Apr 19 12:12:22.711864 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.711702 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7hvqw\"" Apr 19 12:12:22.718846 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.718828 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9xxkb" Apr 19 12:12:22.718846 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.718848 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:12:22.719013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.718827 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kzhlq" Apr 19 12:12:22.864678 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:22.864646 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kzhlq"] Apr 19 12:12:22.869171 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:22.869140 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b5e2955_23ce_47dd_b5ce_f68dd7cd9be2.slice/crio-0e2a526a893befe4f049659071dd4d39df8fa19822f8f1e417603b1578ce6d41 WatchSource:0}: Error finding container 0e2a526a893befe4f049659071dd4d39df8fa19822f8f1e417603b1578ce6d41: Status 404 returned error can't find the container with id 0e2a526a893befe4f049659071dd4d39df8fa19822f8f1e417603b1578ce6d41 Apr 19 12:12:23.083159 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.083064 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9xxkb"] Apr 19 12:12:23.086517 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:23.086485 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15abc5b3_a4e0_41a2_b57d_ee187b37cd52.slice/crio-a0d8e4745b344723f627773594207ad7170f5a5c09d9adc5166e3eca71684811 WatchSource:0}: Error finding container a0d8e4745b344723f627773594207ad7170f5a5c09d9adc5166e3eca71684811: Status 404 returned error can't find the container with id a0d8e4745b344723f627773594207ad7170f5a5c09d9adc5166e3eca71684811 Apr 19 12:12:23.087182 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.087086 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f5598bc7f-hrzcz"] Apr 19 12:12:23.089616 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:12:23.089598 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e0d7319_d55e_4c55_b262_2e01232a5a5c.slice/crio-1badd1ef617ad1fb87d138d72bb2f229154ef052962abfd71698077091604d7c WatchSource:0}: Error finding container 1badd1ef617ad1fb87d138d72bb2f229154ef052962abfd71698077091604d7c: Status 404 returned error can't find the container with id 1badd1ef617ad1fb87d138d72bb2f229154ef052962abfd71698077091604d7c Apr 19 12:12:23.525298 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.525262 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9xxkb" event={"ID":"15abc5b3-a4e0-41a2-b57d-ee187b37cd52","Type":"ContainerStarted","Data":"a0d8e4745b344723f627773594207ad7170f5a5c09d9adc5166e3eca71684811"} Apr 19 12:12:23.526390 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.526361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kzhlq" event={"ID":"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2","Type":"ContainerStarted","Data":"0e2a526a893befe4f049659071dd4d39df8fa19822f8f1e417603b1578ce6d41"} Apr 19 12:12:23.527943 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.527921 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" event={"ID":"0e0d7319-d55e-4c55-b262-2e01232a5a5c","Type":"ContainerStarted","Data":"680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca"} Apr 19 12:12:23.528045 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.527949 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" event={"ID":"0e0d7319-d55e-4c55-b262-2e01232a5a5c","Type":"ContainerStarted","Data":"1badd1ef617ad1fb87d138d72bb2f229154ef052962abfd71698077091604d7c"} Apr 19 12:12:23.528108 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.528061 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:12:23.547455 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.547414 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" podStartSLOduration=161.547396727 podStartE2EDuration="2m41.547396727s" podCreationTimestamp="2026-04-19 12:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:12:23.545975751 +0000 UTC m=+162.264799906" watchObservedRunningTime="2026-04-19 12:12:23.547396727 +0000 UTC m=+162.266220883" Apr 19 12:12:23.597660 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.597604 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:23.597844 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.597755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:12:23.599242 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.599208 2568 patch_prober.go:28] interesting pod/console-64d9969655-xjn7b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.20:8443/health\": dial tcp 10.132.0.20:8443: connect: connection refused" start-of-body= Apr 19 12:12:23.599349 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:23.599257 2568 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d9969655-xjn7b" podUID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" containerName="console" probeResult="failure" output="Get \"https://10.132.0.20:8443/health\": dial tcp 10.132.0.20:8443: connect: connection refused" Apr 19 12:12:25.537891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:25.537797 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-9xxkb" event={"ID":"15abc5b3-a4e0-41a2-b57d-ee187b37cd52","Type":"ContainerStarted","Data":"16521a5880c11903b4d28f2539e55251808ff6a97298287e966fb8afb37ca6fa"} Apr 19 12:12:25.539305 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:25.539282 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kzhlq" event={"ID":"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2","Type":"ContainerStarted","Data":"c1bff8242a5436e7de09f04dbd01fd2842feef810751a0cf163e0658a0282d82"} Apr 19 12:12:25.539408 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:25.539312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kzhlq" event={"ID":"9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2","Type":"ContainerStarted","Data":"175f793238ac802e2ab07b2ff0819fd69f1b200ac8271e392ed8a545b911b8f5"} Apr 19 12:12:25.539457 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:25.539443 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kzhlq" Apr 19 12:12:25.552468 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:25.552432 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9xxkb" podStartSLOduration=129.389676175 podStartE2EDuration="2m11.552421736s" podCreationTimestamp="2026-04-19 12:10:14 +0000 UTC" firstStartedPulling="2026-04-19 12:12:23.088743728 +0000 UTC m=+161.807567865" lastFinishedPulling="2026-04-19 12:12:25.251489276 +0000 UTC m=+163.970313426" observedRunningTime="2026-04-19 12:12:25.551005407 +0000 UTC m=+164.269829563" watchObservedRunningTime="2026-04-19 12:12:25.552421736 +0000 UTC m=+164.271245891" Apr 19 12:12:25.565345 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:25.565296 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kzhlq" podStartSLOduration=129.702132302 podStartE2EDuration="2m11.565280568s" 
podCreationTimestamp="2026-04-19 12:10:14 +0000 UTC" firstStartedPulling="2026-04-19 12:12:22.87126075 +0000 UTC m=+161.590084883" lastFinishedPulling="2026-04-19 12:12:24.734409003 +0000 UTC m=+163.453233149" observedRunningTime="2026-04-19 12:12:25.565164179 +0000 UTC m=+164.283988336" watchObservedRunningTime="2026-04-19 12:12:25.565280568 +0000 UTC m=+164.284104723" Apr 19 12:12:33.597713 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:33.597681 2568 patch_prober.go:28] interesting pod/console-64d9969655-xjn7b container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.20:8443/health\": dial tcp 10.132.0.20:8443: connect: connection refused" start-of-body= Apr 19 12:12:33.598144 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:33.597749 2568 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-64d9969655-xjn7b" podUID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" containerName="console" probeResult="failure" output="Get \"https://10.132.0.20:8443/health\": dial tcp 10.132.0.20:8443: connect: connection refused" Apr 19 12:12:35.544564 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:35.544532 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kzhlq" Apr 19 12:12:41.586003 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:41.585969 2568 generic.go:358] "Generic (PLEG): container finished" podID="a027b65b-1aeb-4b64-ba23-6687e4bcf69b" containerID="967c885d29b7460550c40eba1c9717cefcf72131b117eb7e9ddd6c106709dfb6" exitCode=0 Apr 19 12:12:41.586359 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:41.586044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" event={"ID":"a027b65b-1aeb-4b64-ba23-6687e4bcf69b","Type":"ContainerDied","Data":"967c885d29b7460550c40eba1c9717cefcf72131b117eb7e9ddd6c106709dfb6"} Apr 19 12:12:41.586400 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:12:41.586371 2568 scope.go:117] "RemoveContainer" containerID="967c885d29b7460550c40eba1c9717cefcf72131b117eb7e9ddd6c106709dfb6" Apr 19 12:12:42.589883 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:42.589853 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-6k6qf" event={"ID":"a027b65b-1aeb-4b64-ba23-6687e4bcf69b","Type":"ContainerStarted","Data":"dcf345797790b2df9a7c010d9154339feccc8713da691db5f981fd3661d36fb1"} Apr 19 12:12:42.723096 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:42.723066 2568 patch_prober.go:28] interesting pod/image-registry-5f5598bc7f-hrzcz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 19 12:12:42.723249 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:42.723113 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" podUID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 19 12:12:43.234987 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:43.234954 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d9969655-xjn7b"] Apr 19 12:12:44.537355 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:44.537325 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:12:44.911728 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:44.911651 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f5598bc7f-hrzcz"] Apr 19 12:12:50.991354 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:50.991323 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-kzhlq_9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2/dns/0.log" Apr 19 12:12:51.192263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:51.192232 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kzhlq_9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2/kube-rbac-proxy/0.log" Apr 19 12:12:51.592020 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:51.591996 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h8wxf_9920f4d8-e6b0-4993-baa5-e254915bebae/dns-node-resolver/0.log" Apr 19 12:12:51.992534 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:51.992504 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-57d6b784d-fjqwn_62ce91ba-1d6b-4aae-9fc2-6ef69f90a963/router/0.log" Apr 19 12:12:52.191666 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:12:52.191620 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9xxkb_15abc5b3-a4e0-41a2-b57d-ee187b37cd52/serve-healthcheck-canary/0.log" Apr 19 12:13:08.253400 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.253339 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64d9969655-xjn7b" podUID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" containerName="console" containerID="cri-o://ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5" gracePeriod=15 Apr 19 12:13:08.497303 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.497277 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d9969655-xjn7b_16abf6b0-046b-40c3-9ccc-ca5fdd70e81d/console/0.log" Apr 19 12:13:08.497442 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.497351 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:13:08.577664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.577565 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-serving-cert\") pod \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " Apr 19 12:13:08.577664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.577601 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-service-ca\") pod \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " Apr 19 12:13:08.577664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.577647 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-config\") pod \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " Apr 19 12:13:08.577664 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.577664 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-oauth-config\") pod \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " Apr 19 12:13:08.577985 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.577864 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpxmj\" (UniqueName: \"kubernetes.io/projected/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-kube-api-access-rpxmj\") pod \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " Apr 19 12:13:08.577985 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.577928 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-oauth-serving-cert\") pod \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\" (UID: \"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d\") " Apr 19 12:13:08.578084 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.578058 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-config" (OuterVolumeSpecName: "console-config") pod "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" (UID: "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:08.578084 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.578071 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-service-ca" (OuterVolumeSpecName: "service-ca") pod "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" (UID: "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:08.578286 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.578262 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" (UID: "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:08.578385 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.578367 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-service-ca\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:08.578449 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.578392 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-config\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:08.578449 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.578406 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-oauth-serving-cert\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:08.579943 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.579920 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" (UID: "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:08.580038 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.579939 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-kube-api-access-rpxmj" (OuterVolumeSpecName: "kube-api-access-rpxmj") pod "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" (UID: "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d"). InnerVolumeSpecName "kube-api-access-rpxmj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:08.580038 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.579947 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" (UID: "16abf6b0-046b-40c3-9ccc-ca5fdd70e81d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:08.676800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.676770 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d9969655-xjn7b_16abf6b0-046b-40c3-9ccc-ca5fdd70e81d/console/0.log" Apr 19 12:13:08.676972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.676814 2568 generic.go:358] "Generic (PLEG): container finished" podID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" containerID="ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5" exitCode=2 Apr 19 12:13:08.676972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.676880 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d9969655-xjn7b" Apr 19 12:13:08.676972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.676898 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d9969655-xjn7b" event={"ID":"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d","Type":"ContainerDied","Data":"ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5"} Apr 19 12:13:08.676972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.676934 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d9969655-xjn7b" event={"ID":"16abf6b0-046b-40c3-9ccc-ca5fdd70e81d","Type":"ContainerDied","Data":"ecaca63b4772838ced1b0597cc295765ab327739c3a7c336ab299fe8b623180a"} Apr 19 12:13:08.676972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.676951 2568 scope.go:117] "RemoveContainer" containerID="ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5" Apr 19 12:13:08.678961 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.678937 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-serving-cert\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:08.679077 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.678967 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-console-oauth-config\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:08.679077 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.678981 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpxmj\" (UniqueName: \"kubernetes.io/projected/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d-kube-api-access-rpxmj\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:08.686389 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.686371 2568 
scope.go:117] "RemoveContainer" containerID="ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5" Apr 19 12:13:08.686674 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:08.686656 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5\": container with ID starting with ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5 not found: ID does not exist" containerID="ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5" Apr 19 12:13:08.686749 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.686682 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5"} err="failed to get container status \"ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5\": rpc error: code = NotFound desc = could not find container \"ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5\": container with ID starting with ae888b70143baa2845f222d958c9e6bda0d10623d90ab187c40cbe7b97db2fe5 not found: ID does not exist" Apr 19 12:13:08.696447 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.696424 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d9969655-xjn7b"] Apr 19 12:13:08.702538 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:08.702515 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64d9969655-xjn7b"] Apr 19 12:13:09.868130 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:09.868098 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" path="/var/lib/kubelet/pods/16abf6b0-046b-40c3-9ccc-ca5fdd70e81d/volumes" Apr 19 12:13:09.930105 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:09.930071 2568 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" podUID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" containerName="registry" containerID="cri-o://680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca" gracePeriod=30 Apr 19 12:13:11.167288 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.167266 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:13:11.298922 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.298849 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-trusted-ca\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 12:13:11.298922 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.298892 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-installation-pull-secrets\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 12:13:11.299131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.298934 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-bound-sa-token\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 12:13:11.299131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.298962 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-certificates\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 
12:13:11.299131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.298980 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 12:13:11.299131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.299023 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-image-registry-private-configuration\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 12:13:11.299131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.299069 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjctz\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-kube-api-access-zjctz\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 12:13:11.299131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.299100 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7319-d55e-4c55-b262-2e01232a5a5c-ca-trust-extracted\") pod \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\" (UID: \"0e0d7319-d55e-4c55-b262-2e01232a5a5c\") " Apr 19 12:13:11.299408 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.299363 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:11.299525 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.299442 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:11.301325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.301289 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:11.301435 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.301347 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:11.301539 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.301516 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:11.301779 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.301757 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-kube-api-access-zjctz" (OuterVolumeSpecName: "kube-api-access-zjctz") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "kube-api-access-zjctz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:11.301834 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.301757 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:11.307964 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.307938 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e0d7319-d55e-4c55-b262-2e01232a5a5c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0e0d7319-d55e-4c55-b262-2e01232a5a5c" (UID: "0e0d7319-d55e-4c55-b262-2e01232a5a5c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:13:11.400175 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400148 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-certificates\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:11.400175 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400177 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-registry-tls\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:11.400336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400191 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-image-registry-private-configuration\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:11.400336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400204 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjctz\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-kube-api-access-zjctz\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:11.400336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400217 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0e0d7319-d55e-4c55-b262-2e01232a5a5c-ca-trust-extracted\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:11.400336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400231 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0e0d7319-d55e-4c55-b262-2e01232a5a5c-trusted-ca\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 
12:13:11.400336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400243 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0e0d7319-d55e-4c55-b262-2e01232a5a5c-installation-pull-secrets\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:11.400336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.400255 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0e0d7319-d55e-4c55-b262-2e01232a5a5c-bound-sa-token\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:11.687907 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.687871 2568 generic.go:358] "Generic (PLEG): container finished" podID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" containerID="680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca" exitCode=0 Apr 19 12:13:11.688059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.687926 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" event={"ID":"0e0d7319-d55e-4c55-b262-2e01232a5a5c","Type":"ContainerDied","Data":"680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca"} Apr 19 12:13:11.688059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.687956 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" event={"ID":"0e0d7319-d55e-4c55-b262-2e01232a5a5c","Type":"ContainerDied","Data":"1badd1ef617ad1fb87d138d72bb2f229154ef052962abfd71698077091604d7c"} Apr 19 12:13:11.688059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.687965 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5f5598bc7f-hrzcz" Apr 19 12:13:11.688059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.687976 2568 scope.go:117] "RemoveContainer" containerID="680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca" Apr 19 12:13:11.696108 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.696088 2568 scope.go:117] "RemoveContainer" containerID="680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca" Apr 19 12:13:11.696348 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:11.696332 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca\": container with ID starting with 680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca not found: ID does not exist" containerID="680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca" Apr 19 12:13:11.696392 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.696355 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca"} err="failed to get container status \"680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca\": rpc error: code = NotFound desc = could not find container \"680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca\": container with ID starting with 680afc5eaa45c1fa6ad1c27bae17d35f986960a078ef0f573e35674ccd0783ca not found: ID does not exist" Apr 19 12:13:11.707424 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.707400 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5f5598bc7f-hrzcz"] Apr 19 12:13:11.710365 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.710345 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5f5598bc7f-hrzcz"] Apr 19 
12:13:11.868139 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:11.868110 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" path="/var/lib/kubelet/pods/0e0d7319-d55e-4c55-b262-2e01232a5a5c/volumes" Apr 19 12:13:22.218009 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.217982 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:13:22.218417 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.218373 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="alertmanager" containerID="cri-o://b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2" gracePeriod=120 Apr 19 12:13:22.218489 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.218404 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-metric" containerID="cri-o://f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b" gracePeriod=120 Apr 19 12:13:22.218489 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.218438 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-web" containerID="cri-o://7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5" gracePeriod=120 Apr 19 12:13:22.218604 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.218483 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="config-reloader" containerID="cri-o://ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b" gracePeriod=120 Apr 
19 12:13:22.218604 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.218432 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="prom-label-proxy" containerID="cri-o://8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d" gracePeriod=120 Apr 19 12:13:22.218604 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.218552 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy" containerID="cri-o://f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0" gracePeriod=120 Apr 19 12:13:22.722367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722339 2568 generic.go:358] "Generic (PLEG): container finished" podID="2841affb-5791-4e66-bbba-2b764530ccc4" containerID="8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d" exitCode=0 Apr 19 12:13:22.722367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722363 2568 generic.go:358] "Generic (PLEG): container finished" podID="2841affb-5791-4e66-bbba-2b764530ccc4" containerID="f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b" exitCode=0 Apr 19 12:13:22.722367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722370 2568 generic.go:358] "Generic (PLEG): container finished" podID="2841affb-5791-4e66-bbba-2b764530ccc4" containerID="f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0" exitCode=0 Apr 19 12:13:22.722367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722375 2568 generic.go:358] "Generic (PLEG): container finished" podID="2841affb-5791-4e66-bbba-2b764530ccc4" containerID="ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b" exitCode=0 Apr 19 12:13:22.722650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722382 2568 generic.go:358] "Generic (PLEG): container 
finished" podID="2841affb-5791-4e66-bbba-2b764530ccc4" containerID="b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2" exitCode=0 Apr 19 12:13:22.722650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d"} Apr 19 12:13:22.722650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722440 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b"} Apr 19 12:13:22.722650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722450 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0"} Apr 19 12:13:22.722650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722459 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b"} Apr 19 12:13:22.722650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:22.722468 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2"} Apr 19 12:13:23.458316 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.458295 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:13:23.607049 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.606969 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-web\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607049 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607003 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607049 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607021 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-trusted-ca-bundle\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607049 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607039 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-cluster-tls-config\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607059 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-main-db\") pod 
\"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607087 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-tls-assets\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607114 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607159 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9sm9\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-kube-api-access-p9sm9\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607183 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-main-tls\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607236 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-metrics-client-ca\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" 
(UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607263 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-web-config\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607290 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-config-out\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607332 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-config-volume\") pod \"2841affb-5791-4e66-bbba-2b764530ccc4\" (UID: \"2841affb-5791-4e66-bbba-2b764530ccc4\") " Apr 19 12:13:23.607832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607407 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:13:23.607832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607466 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:23.607832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607601 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-main-db\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.607832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.607653 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.609556 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.609520 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:23.609696 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.609619 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:23.609802 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.609767 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:23.610208 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.610090 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:13:23.610208 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.610125 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-config-volume" (OuterVolumeSpecName: "config-volume") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:23.610208 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.610141 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:23.610208 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.610181 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:23.610529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.610500 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-kube-api-access-p9sm9" (OuterVolumeSpecName: "kube-api-access-p9sm9") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "kube-api-access-p9sm9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:13:23.611757 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.611735 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-config-out" (OuterVolumeSpecName: "config-out") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:13:23.614473 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.614448 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:23.620931 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.620901 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-web-config" (OuterVolumeSpecName: "web-config") pod "2841affb-5791-4e66-bbba-2b764530ccc4" (UID: "2841affb-5791-4e66-bbba-2b764530ccc4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:13:23.708769 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708736 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-cluster-tls-config\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708769 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708766 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-tls-assets\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708778 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:13:23.708789 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9sm9\" (UniqueName: \"kubernetes.io/projected/2841affb-5791-4e66-bbba-2b764530ccc4-kube-api-access-p9sm9\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708798 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-main-tls\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708807 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2841affb-5791-4e66-bbba-2b764530ccc4-metrics-client-ca\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708817 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-web-config\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708826 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2841affb-5791-4e66-bbba-2b764530ccc4-config-out\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708834 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-config-volume\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708842 2568 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.708967 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.708851 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2841affb-5791-4e66-bbba-2b764530ccc4-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:13:23.727502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.727471 2568 generic.go:358] "Generic (PLEG): container finished" podID="2841affb-5791-4e66-bbba-2b764530ccc4" containerID="7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5" exitCode=0 Apr 19 12:13:23.727611 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.727561 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5"} Apr 19 12:13:23.727611 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.727599 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2841affb-5791-4e66-bbba-2b764530ccc4","Type":"ContainerDied","Data":"3f25ac19e31229d5bed764ee30d2cad63e92dcad07c46013448c80290ea73a79"} Apr 19 12:13:23.727715 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.727615 2568 scope.go:117] "RemoveContainer" containerID="8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d" Apr 19 12:13:23.727715 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.727573 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 19 12:13:23.735268 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.735251 2568 scope.go:117] "RemoveContainer" containerID="f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b" Apr 19 12:13:23.742276 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.742257 2568 scope.go:117] "RemoveContainer" containerID="f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0" Apr 19 12:13:23.749062 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.749046 2568 scope.go:117] "RemoveContainer" containerID="7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5" Apr 19 12:13:23.751118 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.751098 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:13:23.756452 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.756435 2568 scope.go:117] "RemoveContainer" containerID="ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b" Apr 19 12:13:23.757267 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.757246 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 19 12:13:23.763034 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.763017 2568 scope.go:117] "RemoveContainer" containerID="b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2" Apr 19 12:13:23.769503 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.769488 2568 scope.go:117] "RemoveContainer" containerID="2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5" Apr 19 12:13:23.775818 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.775800 2568 scope.go:117] "RemoveContainer" containerID="8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d" Apr 19 12:13:23.776066 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:23.776046 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d\": container with ID starting with 8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d not found: ID does not exist" containerID="8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d" Apr 19 12:13:23.776172 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.776079 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d"} err="failed to get container status \"8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d\": rpc error: code = NotFound desc = could not find container \"8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d\": container with ID starting with 8e3646310c5d37c750e73bbdf24e6dd126bbc33ef23573ae57cdca06b652c78d not found: ID does not exist" Apr 19 12:13:23.776270 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.776207 2568 scope.go:117] "RemoveContainer" containerID="f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b" Apr 19 12:13:23.776565 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:23.776529 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b\": container with ID starting with f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b not found: ID does not exist" containerID="f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b" Apr 19 12:13:23.776726 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.776573 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b"} err="failed to get container status \"f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b\": rpc error: code = NotFound desc 
= could not find container \"f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b\": container with ID starting with f7525601388e8452109fc45ef8488d1a43163637022e08d2054d77031ec8ed8b not found: ID does not exist" Apr 19 12:13:23.776726 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.776594 2568 scope.go:117] "RemoveContainer" containerID="f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0" Apr 19 12:13:23.776984 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:23.776966 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0\": container with ID starting with f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0 not found: ID does not exist" containerID="f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0" Apr 19 12:13:23.777062 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.776990 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0"} err="failed to get container status \"f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0\": rpc error: code = NotFound desc = could not find container \"f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0\": container with ID starting with f522fb0818a300b938631c36e30b9d91dd26e965aae03393df4c72c0416b77b0 not found: ID does not exist" Apr 19 12:13:23.777062 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.777010 2568 scope.go:117] "RemoveContainer" containerID="7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5" Apr 19 12:13:23.777358 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:23.777334 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5\": 
container with ID starting with 7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5 not found: ID does not exist" containerID="7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5"
Apr 19 12:13:23.777436 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.777366 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5"} err="failed to get container status \"7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5\": rpc error: code = NotFound desc = could not find container \"7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5\": container with ID starting with 7bbb01d4739881bffd5989e8e09e827e4e5d004d0aa307f88c899c50c9aec3a5 not found: ID does not exist"
Apr 19 12:13:23.777436 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.777387 2568 scope.go:117] "RemoveContainer" containerID="ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b"
Apr 19 12:13:23.777707 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:23.777688 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b\": container with ID starting with ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b not found: ID does not exist" containerID="ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b"
Apr 19 12:13:23.777774 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.777713 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b"} err="failed to get container status \"ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b\": rpc error: code = NotFound desc = could not find container \"ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b\": container with ID starting with ec2414f04de561ea174e739a840738a26a6ea05846a1f81a0d60aed769f4814b not found: ID does not exist"
Apr 19 12:13:23.777774 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.777735 2568 scope.go:117] "RemoveContainer" containerID="b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2"
Apr 19 12:13:23.778006 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:23.777989 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2\": container with ID starting with b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2 not found: ID does not exist" containerID="b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2"
Apr 19 12:13:23.778076 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778013 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2"} err="failed to get container status \"b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2\": rpc error: code = NotFound desc = could not find container \"b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2\": container with ID starting with b148cd5cf2dad5ed5e69ed10a5b880865aa60fa9ba9400ad47a2c1d6f28318c2 not found: ID does not exist"
Apr 19 12:13:23.778076 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778032 2568 scope.go:117] "RemoveContainer" containerID="2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5"
Apr 19 12:13:23.778297 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:13:23.778280 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5\": container with ID starting with 2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5 not found: ID does not exist" containerID="2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5"
Apr 19 12:13:23.778366 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778304 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5"} err="failed to get container status \"2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5\": rpc error: code = NotFound desc = could not find container \"2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5\": container with ID starting with 2d49b99c7fc97de63df709d902dd0d42a4649ef4d681b4e469d06fe4da1e72f5 not found: ID does not exist"
Apr 19 12:13:23.778448 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778432 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:13:23.778756 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778743 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="init-config-reloader"
Apr 19 12:13:23.778809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778758 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="init-config-reloader"
Apr 19 12:13:23.778809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778774 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="config-reloader"
Apr 19 12:13:23.778809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778779 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="config-reloader"
Apr 19 12:13:23.778809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778785 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" containerName="registry"
Apr 19 12:13:23.778809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778790 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" containerName="registry"
Apr 19 12:13:23.778809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778796 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" containerName="console"
Apr 19 12:13:23.778809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778801 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" containerName="console"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778813 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778819 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778827 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="alertmanager"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778834 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="alertmanager"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778843 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-metric"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778849 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-metric"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778855 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-web"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778860 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-web"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778866 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="prom-label-proxy"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778871 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="prom-label-proxy"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778914 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778924 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="alertmanager"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778931 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-web"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778936 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="kube-rbac-proxy-metric"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778943 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="prom-label-proxy"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778949 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="16abf6b0-046b-40c3-9ccc-ca5fdd70e81d" containerName="console"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778958 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e0d7319-d55e-4c55-b262-2e01232a5a5c" containerName="registry"
Apr 19 12:13:23.779018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.778963 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" containerName="config-reloader"
Apr 19 12:13:23.783950 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.783933 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.787485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787211 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 19 12:13:23.787485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787298 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 19 12:13:23.787485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787315 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 19 12:13:23.787485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787327 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 19 12:13:23.787485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787369 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 19 12:13:23.787485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787214 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 19 12:13:23.787837 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787492 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 19 12:13:23.787837 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.787538 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 19 12:13:23.788362 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.788342 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4k858\""
Apr 19 12:13:23.792313 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.792171 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 19 12:13:23.793141 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.793121 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:13:23.869899 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.869826 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2841affb-5791-4e66-bbba-2b764530ccc4" path="/var/lib/kubelet/pods/2841affb-5791-4e66-bbba-2b764530ccc4/volumes"
Apr 19 12:13:23.910215 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910346 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910346 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-config-out\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910346 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910275 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-web-config\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910346 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910484 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910370 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910484 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910400 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910484 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4thc\" (UniqueName: \"kubernetes.io/projected/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-kube-api-access-l4thc\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910484 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910454 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910647 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910499 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910647 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910647 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:23.910647 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:23.910588 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-config-volume\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-config-volume\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011308 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011427 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011427 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011399 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-config-out\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011427 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-web-config\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011574 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011574 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011574 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011512 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011574 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011543 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4thc\" (UniqueName: \"kubernetes.io/projected/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-kube-api-access-l4thc\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011574 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011855 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.011996 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.011968 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.012262 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.012244 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.014490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.014222 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-config-out\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.014490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.014225 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.014490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.014259 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.014490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.014341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-web-config\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.014490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.014341 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-config-volume\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.014863 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.014700 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.014863 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.014744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.015037 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.015019 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.015451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.015427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.016060 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.016044 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.021349 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.021331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4thc\" (UniqueName: \"kubernetes.io/projected/31684d9c-023e-4de7-a35c-c02cfb7c0b4f-kube-api-access-l4thc\") pod \"alertmanager-main-0\" (UID: \"31684d9c-023e-4de7-a35c-c02cfb7c0b4f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.094114 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.094087 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 19 12:13:24.216169 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.216125 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 19 12:13:24.217779 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:13:24.217755 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31684d9c_023e_4de7_a35c_c02cfb7c0b4f.slice/crio-657b60de5dafa64a8b213b428ec0068a38a59cbc251b46467bacd3980a6504a3 WatchSource:0}: Error finding container 657b60de5dafa64a8b213b428ec0068a38a59cbc251b46467bacd3980a6504a3: Status 404 returned error can't find the container with id 657b60de5dafa64a8b213b428ec0068a38a59cbc251b46467bacd3980a6504a3
Apr 19 12:13:24.732503 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.732467 2568 generic.go:358] "Generic (PLEG): container finished" podID="31684d9c-023e-4de7-a35c-c02cfb7c0b4f" containerID="7d9ca5b4e91de6b77aa2a454b0f0e524839fa3ba6348faf080bbea7323aa1403" exitCode=0
Apr 19 12:13:24.732885 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.732526 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerDied","Data":"7d9ca5b4e91de6b77aa2a454b0f0e524839fa3ba6348faf080bbea7323aa1403"}
Apr 19 12:13:24.732885 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:24.732551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerStarted","Data":"657b60de5dafa64a8b213b428ec0068a38a59cbc251b46467bacd3980a6504a3"}
Apr 19 12:13:25.738453 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:25.738414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerStarted","Data":"2fd66a24f1d4c879bc3b7475d9c39b13335d6a420090297dc3c605fbcbbd727c"}
Apr 19 12:13:25.738453 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:25.738448 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerStarted","Data":"7244d5ac6289a531b8d7f07c79f07c58f5c9b3f60debde3895642f06e7d5af67"}
Apr 19 12:13:25.738453 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:25.738459 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerStarted","Data":"5fdb72682cb3d8f77d6ca45a1774f2293cfba0961c2b3cf73cdc196d4b6bb7dd"}
Apr 19 12:13:25.738900 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:25.738467 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerStarted","Data":"75a745c9a78ecfa19861081cc4e5ddbe53f62be73f50233f86e6115835189af6"}
Apr 19 12:13:25.738900 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:25.738475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerStarted","Data":"8975ff0d1fe34244100b23c81f565d4cd5d6d299acce0fc7f665887a57ee7886"}
Apr 19 12:13:25.738900 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:25.738483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"31684d9c-023e-4de7-a35c-c02cfb7c0b4f","Type":"ContainerStarted","Data":"511f935810701cf7e2576dbb87eea9588d0e11f830a626b5cb6f0f59c486e9cf"}
Apr 19 12:13:25.761697 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:25.761657 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.76164386 podStartE2EDuration="2.76164386s" podCreationTimestamp="2026-04-19 12:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:13:25.760126144 +0000 UTC m=+224.478950299" watchObservedRunningTime="2026-04-19 12:13:25.76164386 +0000 UTC m=+224.480468006"
Apr 19 12:13:37.141568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.141530 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-779499566f-wd8rw"]
Apr 19 12:13:37.146196 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.146172 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-779499566f-wd8rw"
Apr 19 12:13:37.148641 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.148609 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 19 12:13:37.148735 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.148663 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 19 12:13:37.148774 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.148735 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 19 12:13:37.148812 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.148788 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 19 12:13:37.149879 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.149860 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 19 12:13:37.150051 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.150032 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 19 12:13:37.150213 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.150186 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jfhvt\""
Apr 19 12:13:37.150384 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.150369 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 19 12:13:37.153387 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.153358 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-779499566f-wd8rw"]
Apr 19 12:13:37.154249 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.154226 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 19 12:13:37.313812 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.313779 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-oauth-serving-cert\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw"
Apr 19 12:13:37.314007 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.313819 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-config\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw"
Apr 19 12:13:37.314007 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.313847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rz2v\" (UniqueName: \"kubernetes.io/projected/c4505d4d-1a94-46df-9a91-58f9beb22fe2-kube-api-access-9rz2v\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw"
Apr 19 12:13:37.314007 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.313922 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-service-ca\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw"
Apr 19 12:13:37.314007 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.313970 2568 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-oauth-config\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.314167 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.314018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-trusted-ca-bundle\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.314167 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.314048 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-serving-cert\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.414652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.414558 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-trusted-ca-bundle\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.414652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.414592 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-serving-cert\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " 
pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.414895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.414723 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-oauth-serving-cert\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.414895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.414750 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-config\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.414895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.414774 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rz2v\" (UniqueName: \"kubernetes.io/projected/c4505d4d-1a94-46df-9a91-58f9beb22fe2-kube-api-access-9rz2v\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.414895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.414794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-service-ca\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.414895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.414831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-oauth-config\") pod 
\"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.415496 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.415464 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-config\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.415649 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.415526 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-oauth-serving-cert\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.415649 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.415557 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-trusted-ca-bundle\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.415782 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.415651 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-service-ca\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.417150 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.417128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-serving-cert\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.417150 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.417144 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-oauth-config\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.422665 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.422620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rz2v\" (UniqueName: \"kubernetes.io/projected/c4505d4d-1a94-46df-9a91-58f9beb22fe2-kube-api-access-9rz2v\") pod \"console-779499566f-wd8rw\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.457584 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.457561 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:37.576837 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.576814 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-779499566f-wd8rw"] Apr 19 12:13:37.579363 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:13:37.579335 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4505d4d_1a94_46df_9a91_58f9beb22fe2.slice/crio-bc345aaed6dffd98f63466e12642566a1f9c3e446405bc7073b21e615f90575f WatchSource:0}: Error finding container bc345aaed6dffd98f63466e12642566a1f9c3e446405bc7073b21e615f90575f: Status 404 returned error can't find the container with id bc345aaed6dffd98f63466e12642566a1f9c3e446405bc7073b21e615f90575f Apr 19 12:13:37.773844 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.773805 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779499566f-wd8rw" event={"ID":"c4505d4d-1a94-46df-9a91-58f9beb22fe2","Type":"ContainerStarted","Data":"b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e"} Apr 19 12:13:37.773844 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.773839 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779499566f-wd8rw" event={"ID":"c4505d4d-1a94-46df-9a91-58f9beb22fe2","Type":"ContainerStarted","Data":"bc345aaed6dffd98f63466e12642566a1f9c3e446405bc7073b21e615f90575f"} Apr 19 12:13:37.790567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:37.790523 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-779499566f-wd8rw" podStartSLOduration=0.790500968 podStartE2EDuration="790.500968ms" podCreationTimestamp="2026-04-19 12:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:13:37.788768403 +0000 UTC 
m=+236.507592570" watchObservedRunningTime="2026-04-19 12:13:37.790500968 +0000 UTC m=+236.509325123" Apr 19 12:13:47.457911 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:47.457878 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:47.458352 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:47.457955 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:47.462532 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:47.462502 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:13:47.808576 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:13:47.808509 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:14:35.253433 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.253398 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg"] Apr 19 12:14:35.257018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.257001 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.259662 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.259643 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:14:35.260550 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.260534 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-r2m55\"" Apr 19 12:14:35.260644 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.260537 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:14:35.267759 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.267733 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg"] Apr 19 12:14:35.363930 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.363901 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.364094 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.363959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.364094 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.364058 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kbnk\" (UniqueName: \"kubernetes.io/projected/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-kube-api-access-7kbnk\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.465208 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.465173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.465404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.465258 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kbnk\" (UniqueName: \"kubernetes.io/projected/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-kube-api-access-7kbnk\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.465404 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.465311 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.465584 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:14:35.465563 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.465715 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.465697 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.472990 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.472967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kbnk\" (UniqueName: \"kubernetes.io/projected/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-kube-api-access-7kbnk\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.565765 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.565715 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:35.682907 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.682879 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg"] Apr 19 12:14:35.684984 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:14:35.684955 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d24bd3_eaa5_4e7d_9f46_46058e6bb4bd.slice/crio-83edc3f403b5387d5e75a4db34f8ae1d4e8e72d83dc4403af0e0ea54432f7a18 WatchSource:0}: Error finding container 83edc3f403b5387d5e75a4db34f8ae1d4e8e72d83dc4403af0e0ea54432f7a18: Status 404 returned error can't find the container with id 83edc3f403b5387d5e75a4db34f8ae1d4e8e72d83dc4403af0e0ea54432f7a18 Apr 19 12:14:35.933059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:35.932982 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" event={"ID":"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd","Type":"ContainerStarted","Data":"83edc3f403b5387d5e75a4db34f8ae1d4e8e72d83dc4403af0e0ea54432f7a18"} Apr 19 12:14:42.125839 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:42.125812 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 12:14:42.954110 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:42.954072 2568 generic.go:358] "Generic (PLEG): container finished" podID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerID="31cdedbf16dbffe548be0c27f327530e9a9d9298e70134f9ff9ebfc2eba938bb" exitCode=0 Apr 19 12:14:42.954110 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:42.954115 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" 
event={"ID":"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd","Type":"ContainerDied","Data":"31cdedbf16dbffe548be0c27f327530e9a9d9298e70134f9ff9ebfc2eba938bb"} Apr 19 12:14:42.955090 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:42.955072 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:14:45.963396 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:45.963365 2568 generic.go:358] "Generic (PLEG): container finished" podID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerID="6fff77cfb82843d0c6835332ac27a32cd9f8f1723e4312021e096f42af1d21c2" exitCode=0 Apr 19 12:14:45.963834 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:45.963458 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" event={"ID":"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd","Type":"ContainerDied","Data":"6fff77cfb82843d0c6835332ac27a32cd9f8f1723e4312021e096f42af1d21c2"} Apr 19 12:14:53.990204 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:53.990175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" event={"ID":"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd","Type":"ContainerStarted","Data":"9a85b2cb7a25c16800829c79d43fe16038d529639fea4294a13490cbe7314424"} Apr 19 12:14:54.994796 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:54.994752 2568 generic.go:358] "Generic (PLEG): container finished" podID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerID="9a85b2cb7a25c16800829c79d43fe16038d529639fea4294a13490cbe7314424" exitCode=0 Apr 19 12:14:54.995179 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:54.994811 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" 
event={"ID":"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd","Type":"ContainerDied","Data":"9a85b2cb7a25c16800829c79d43fe16038d529639fea4294a13490cbe7314424"} Apr 19 12:14:55.108742 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.108721 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:55.126420 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.126385 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kbnk\" (UniqueName: \"kubernetes.io/projected/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-kube-api-access-7kbnk\") pod \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " Apr 19 12:14:55.126536 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.126468 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-bundle\") pod \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " Apr 19 12:14:55.126536 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.126493 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-util\") pod \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\" (UID: \"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd\") " Apr 19 12:14:55.127106 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.127080 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-bundle" (OuterVolumeSpecName: "bundle") pod "77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" (UID: "77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:14:55.128696 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.128674 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-kube-api-access-7kbnk" (OuterVolumeSpecName: "kube-api-access-7kbnk") pod "77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" (UID: "77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd"). InnerVolumeSpecName "kube-api-access-7kbnk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:14:55.131716 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.131690 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-util" (OuterVolumeSpecName: "util") pod "77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" (UID: "77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:14:55.227063 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.227035 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:14:55.227063 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.227059 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:14:55.227063 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.227069 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7kbnk\" (UniqueName: \"kubernetes.io/projected/77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd-kube-api-access-7kbnk\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:14:55.998988 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.998957 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" event={"ID":"77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd","Type":"ContainerDied","Data":"83edc3f403b5387d5e75a4db34f8ae1d4e8e72d83dc4403af0e0ea54432f7a18"} Apr 19 12:14:55.998988 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.998984 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5bg4mg" Apr 19 12:14:55.999416 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:14:55.998987 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83edc3f403b5387d5e75a4db34f8ae1d4e8e72d83dc4403af0e0ea54432f7a18" Apr 19 12:15:02.314085 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314053 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5"] Apr 19 12:15:02.314516 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314350 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerName="util" Apr 19 12:15:02.314516 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314360 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerName="util" Apr 19 12:15:02.314516 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314369 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerName="pull" Apr 19 12:15:02.314516 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314375 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerName="pull" Apr 19 12:15:02.314516 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314381 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerName="extract" Apr 19 12:15:02.314516 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314386 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerName="extract" Apr 19 12:15:02.314516 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.314431 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="77d24bd3-eaa5-4e7d-9f46-46058e6bb4bd" containerName="extract" Apr 19 12:15:02.319570 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.319550 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.321947 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.321926 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 19 12:15:02.322277 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.322253 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-jlnsh\"" Apr 19 12:15:02.322277 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.322263 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:15:02.327301 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.327279 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5"] Apr 19 12:15:02.383463 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.383441 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51f972f7-5592-4cb2-a939-efdce01886a8-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-h8gf5\" (UID: 
\"51f972f7-5592-4cb2-a939-efdce01886a8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.383577 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.383504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchsz\" (UniqueName: \"kubernetes.io/projected/51f972f7-5592-4cb2-a939-efdce01886a8-kube-api-access-xchsz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-h8gf5\" (UID: \"51f972f7-5592-4cb2-a939-efdce01886a8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.484168 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.484143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xchsz\" (UniqueName: \"kubernetes.io/projected/51f972f7-5592-4cb2-a939-efdce01886a8-kube-api-access-xchsz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-h8gf5\" (UID: \"51f972f7-5592-4cb2-a939-efdce01886a8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.484282 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.484185 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51f972f7-5592-4cb2-a939-efdce01886a8-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-h8gf5\" (UID: \"51f972f7-5592-4cb2-a939-efdce01886a8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.484564 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.484546 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51f972f7-5592-4cb2-a939-efdce01886a8-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-h8gf5\" (UID: \"51f972f7-5592-4cb2-a939-efdce01886a8\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.492692 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.492668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchsz\" (UniqueName: \"kubernetes.io/projected/51f972f7-5592-4cb2-a939-efdce01886a8-kube-api-access-xchsz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-h8gf5\" (UID: \"51f972f7-5592-4cb2-a939-efdce01886a8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.630155 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.630070 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" Apr 19 12:15:02.756895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:02.756871 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5"] Apr 19 12:15:02.759391 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:02.759363 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f972f7_5592_4cb2_a939_efdce01886a8.slice/crio-5e1020ceafa2d8245f1928fc6ad15b9698c23708c2e37caaddc2098a90f3948a WatchSource:0}: Error finding container 5e1020ceafa2d8245f1928fc6ad15b9698c23708c2e37caaddc2098a90f3948a: Status 404 returned error can't find the container with id 5e1020ceafa2d8245f1928fc6ad15b9698c23708c2e37caaddc2098a90f3948a Apr 19 12:15:03.020240 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:03.020205 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" event={"ID":"51f972f7-5592-4cb2-a939-efdce01886a8","Type":"ContainerStarted","Data":"5e1020ceafa2d8245f1928fc6ad15b9698c23708c2e37caaddc2098a90f3948a"} Apr 19 12:15:08.039108 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:08.039074 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" event={"ID":"51f972f7-5592-4cb2-a939-efdce01886a8","Type":"ContainerStarted","Data":"e7781cc2f54ed25b719d6fab441570cf56a81758e1884bf44f674019e4efbc22"} Apr 19 12:15:08.057935 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:08.057892 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-h8gf5" podStartSLOduration=0.982874269 podStartE2EDuration="6.057879391s" podCreationTimestamp="2026-04-19 12:15:02 +0000 UTC" firstStartedPulling="2026-04-19 12:15:02.761818043 +0000 UTC m=+321.480642176" lastFinishedPulling="2026-04-19 12:15:07.836823165 +0000 UTC m=+326.555647298" observedRunningTime="2026-04-19 12:15:08.057595232 +0000 UTC m=+326.776419387" watchObservedRunningTime="2026-04-19 12:15:08.057879391 +0000 UTC m=+326.776703545" Apr 19 12:15:09.425887 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.425854 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp"] Apr 19 12:15:09.429401 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.429385 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.431509 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.431478 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:15:09.432406 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.432389 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-r2m55\"" Apr 19 12:15:09.432560 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.432391 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:15:09.435955 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.435929 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp"] Apr 19 12:15:09.538088 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.538062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.538216 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.538117 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.538216 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.538184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j64zf\" (UniqueName: \"kubernetes.io/projected/520bc679-5927-4413-941a-fd1e4fa63b12-kube-api-access-j64zf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.639421 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.639390 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.639566 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.639435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j64zf\" (UniqueName: \"kubernetes.io/projected/520bc679-5927-4413-941a-fd1e4fa63b12-kube-api-access-j64zf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.639566 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.639505 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.639818 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:15:09.639797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.639860 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.639835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.646383 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.646357 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j64zf\" (UniqueName: \"kubernetes.io/projected/520bc679-5927-4413-941a-fd1e4fa63b12-kube-api-access-j64zf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.740590 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.740556 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:09.884175 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:09.884104 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp"] Apr 19 12:15:09.886308 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:09.886280 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod520bc679_5927_4413_941a_fd1e4fa63b12.slice/crio-408d2a19f7baf08ac2a4efd5ecbffa48794eac13282ea6cab9902dbe60f8408c WatchSource:0}: Error finding container 408d2a19f7baf08ac2a4efd5ecbffa48794eac13282ea6cab9902dbe60f8408c: Status 404 returned error can't find the container with id 408d2a19f7baf08ac2a4efd5ecbffa48794eac13282ea6cab9902dbe60f8408c Apr 19 12:15:10.046223 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:10.046191 2568 generic.go:358] "Generic (PLEG): container finished" podID="520bc679-5927-4413-941a-fd1e4fa63b12" containerID="d029bc1b47540f2bbcffc4c01a78b64b7823e89d0633a434f70a4dfc19b6ea9b" exitCode=0 Apr 19 12:15:10.046332 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:10.046241 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" event={"ID":"520bc679-5927-4413-941a-fd1e4fa63b12","Type":"ContainerDied","Data":"d029bc1b47540f2bbcffc4c01a78b64b7823e89d0633a434f70a4dfc19b6ea9b"} Apr 19 12:15:10.046332 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:10.046263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" event={"ID":"520bc679-5927-4413-941a-fd1e4fa63b12","Type":"ContainerStarted","Data":"408d2a19f7baf08ac2a4efd5ecbffa48794eac13282ea6cab9902dbe60f8408c"} Apr 19 12:15:11.537564 ip-10-0-140-225 kubenswrapper[2568]: 
I0419 12:15:11.537532 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jc8rz"] Apr 19 12:15:11.540853 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.540829 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:11.543368 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.543336 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 19 12:15:11.543368 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.543354 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 19 12:15:11.544187 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.544167 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-s4kvz\"" Apr 19 12:15:11.551173 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.551153 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jc8rz"] Apr 19 12:15:11.659385 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.659354 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c2425b0-9662-4819-9926-4de7b26ecd00-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jc8rz\" (UID: \"5c2425b0-9662-4819-9926-4de7b26ecd00\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:11.659531 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.659424 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vctbv\" (UniqueName: \"kubernetes.io/projected/5c2425b0-9662-4819-9926-4de7b26ecd00-kube-api-access-vctbv\") pod \"cert-manager-webhook-597b96b99b-jc8rz\" (UID: 
\"5c2425b0-9662-4819-9926-4de7b26ecd00\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:11.760521 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.760491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c2425b0-9662-4819-9926-4de7b26ecd00-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jc8rz\" (UID: \"5c2425b0-9662-4819-9926-4de7b26ecd00\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:11.760684 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.760557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vctbv\" (UniqueName: \"kubernetes.io/projected/5c2425b0-9662-4819-9926-4de7b26ecd00-kube-api-access-vctbv\") pod \"cert-manager-webhook-597b96b99b-jc8rz\" (UID: \"5c2425b0-9662-4819-9926-4de7b26ecd00\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:11.769096 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.769069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c2425b0-9662-4819-9926-4de7b26ecd00-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-jc8rz\" (UID: \"5c2425b0-9662-4819-9926-4de7b26ecd00\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:11.769319 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.769299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vctbv\" (UniqueName: \"kubernetes.io/projected/5c2425b0-9662-4819-9926-4de7b26ecd00-kube-api-access-vctbv\") pod \"cert-manager-webhook-597b96b99b-jc8rz\" (UID: \"5c2425b0-9662-4819-9926-4de7b26ecd00\") " pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:11.861909 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:11.861826 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:12.016504 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:12.016469 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-jc8rz"] Apr 19 12:15:12.610533 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:12.610474 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c2425b0_9662_4819_9926_4de7b26ecd00.slice/crio-ca213b19c9a692ece99dd3076e41afcba32d0fff98dbd48d1809a1868891ce92 WatchSource:0}: Error finding container ca213b19c9a692ece99dd3076e41afcba32d0fff98dbd48d1809a1868891ce92: Status 404 returned error can't find the container with id ca213b19c9a692ece99dd3076e41afcba32d0fff98dbd48d1809a1868891ce92 Apr 19 12:15:13.057521 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:13.057482 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" event={"ID":"5c2425b0-9662-4819-9926-4de7b26ecd00","Type":"ContainerStarted","Data":"ca213b19c9a692ece99dd3076e41afcba32d0fff98dbd48d1809a1868891ce92"} Apr 19 12:15:13.058991 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:13.058965 2568 generic.go:358] "Generic (PLEG): container finished" podID="520bc679-5927-4413-941a-fd1e4fa63b12" containerID="169c89067a2ce5e5bf3e5b6656993f38954660b009933ba6aa02a4c32474b597" exitCode=0 Apr 19 12:15:13.059123 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:13.059005 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" event={"ID":"520bc679-5927-4413-941a-fd1e4fa63b12","Type":"ContainerDied","Data":"169c89067a2ce5e5bf3e5b6656993f38954660b009933ba6aa02a4c32474b597"} Apr 19 12:15:14.064896 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:14.064857 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="520bc679-5927-4413-941a-fd1e4fa63b12" containerID="af056417b5a9c64b6fba791064327d17f7b17f040924f1a97c6798549697285f" exitCode=0 Apr 19 12:15:14.065311 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:14.064917 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" event={"ID":"520bc679-5927-4413-941a-fd1e4fa63b12","Type":"ContainerDied","Data":"af056417b5a9c64b6fba791064327d17f7b17f040924f1a97c6798549697285f"} Apr 19 12:15:15.319535 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.319486 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:15.392775 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.392754 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-bundle\") pod \"520bc679-5927-4413-941a-fd1e4fa63b12\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " Apr 19 12:15:15.392875 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.392821 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-util\") pod \"520bc679-5927-4413-941a-fd1e4fa63b12\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " Apr 19 12:15:15.392875 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.392838 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j64zf\" (UniqueName: \"kubernetes.io/projected/520bc679-5927-4413-941a-fd1e4fa63b12-kube-api-access-j64zf\") pod \"520bc679-5927-4413-941a-fd1e4fa63b12\" (UID: \"520bc679-5927-4413-941a-fd1e4fa63b12\") " Apr 19 12:15:15.393123 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.393101 2568 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-bundle" (OuterVolumeSpecName: "bundle") pod "520bc679-5927-4413-941a-fd1e4fa63b12" (UID: "520bc679-5927-4413-941a-fd1e4fa63b12"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:15.394764 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.394742 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520bc679-5927-4413-941a-fd1e4fa63b12-kube-api-access-j64zf" (OuterVolumeSpecName: "kube-api-access-j64zf") pod "520bc679-5927-4413-941a-fd1e4fa63b12" (UID: "520bc679-5927-4413-941a-fd1e4fa63b12"). InnerVolumeSpecName "kube-api-access-j64zf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:15:15.428589 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.428566 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-util" (OuterVolumeSpecName: "util") pod "520bc679-5927-4413-941a-fd1e4fa63b12" (UID: "520bc679-5927-4413-941a-fd1e4fa63b12"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:15.493873 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.493841 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:15.493873 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.493866 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/520bc679-5927-4413-941a-fd1e4fa63b12-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:15.493873 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:15.493876 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j64zf\" (UniqueName: \"kubernetes.io/projected/520bc679-5927-4413-941a-fd1e4fa63b12-kube-api-access-j64zf\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:16.072751 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:16.072727 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" Apr 19 12:15:16.072751 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:16.072731 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87frvgkp" event={"ID":"520bc679-5927-4413-941a-fd1e4fa63b12","Type":"ContainerDied","Data":"408d2a19f7baf08ac2a4efd5ecbffa48794eac13282ea6cab9902dbe60f8408c"} Apr 19 12:15:16.072973 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:16.072764 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="408d2a19f7baf08ac2a4efd5ecbffa48794eac13282ea6cab9902dbe60f8408c" Apr 19 12:15:16.074141 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:16.074117 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" event={"ID":"5c2425b0-9662-4819-9926-4de7b26ecd00","Type":"ContainerStarted","Data":"af60a4327354abf95824d60ad60c3f32ada84d3229f36d577960eaf11405404f"} Apr 19 12:15:16.074269 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:16.074254 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:16.088461 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:16.088422 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" podStartSLOduration=2.331151723 podStartE2EDuration="5.088412165s" podCreationTimestamp="2026-04-19 12:15:11 +0000 UTC" firstStartedPulling="2026-04-19 12:15:12.612501834 +0000 UTC m=+331.331325971" lastFinishedPulling="2026-04-19 12:15:15.36976228 +0000 UTC m=+334.088586413" observedRunningTime="2026-04-19 12:15:16.087375737 +0000 UTC m=+334.806199889" watchObservedRunningTime="2026-04-19 12:15:16.088412165 +0000 UTC m=+334.807236320" Apr 19 12:15:18.602738 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:15:18.602661 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj"] Apr 19 12:15:18.603059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.602962 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="520bc679-5927-4413-941a-fd1e4fa63b12" containerName="pull" Apr 19 12:15:18.603059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.602973 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bc679-5927-4413-941a-fd1e4fa63b12" containerName="pull" Apr 19 12:15:18.603059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.602986 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="520bc679-5927-4413-941a-fd1e4fa63b12" containerName="extract" Apr 19 12:15:18.603059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.602994 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bc679-5927-4413-941a-fd1e4fa63b12" containerName="extract" Apr 19 12:15:18.603059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.603008 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="520bc679-5927-4413-941a-fd1e4fa63b12" containerName="util" Apr 19 12:15:18.603059 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.603013 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="520bc679-5927-4413-941a-fd1e4fa63b12" containerName="util" Apr 19 12:15:18.603246 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.603066 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="520bc679-5927-4413-941a-fd1e4fa63b12" containerName="extract" Apr 19 12:15:18.606842 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.606826 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:18.608993 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.608974 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-9j6vm\"" Apr 19 12:15:18.609226 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.609184 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:15:18.610138 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.609828 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 19 12:15:18.615199 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.615169 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj"] Apr 19 12:15:18.719460 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.719426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23b1fea5-f79c-4b92-af32-5017b5e92514-tmp\") pod \"openshift-lws-operator-bfc7f696d-pzdhj\" (UID: \"23b1fea5-f79c-4b92-af32-5017b5e92514\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:18.719603 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.719464 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9d7f\" (UniqueName: \"kubernetes.io/projected/23b1fea5-f79c-4b92-af32-5017b5e92514-kube-api-access-l9d7f\") pod \"openshift-lws-operator-bfc7f696d-pzdhj\" (UID: \"23b1fea5-f79c-4b92-af32-5017b5e92514\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:18.820663 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.820608 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23b1fea5-f79c-4b92-af32-5017b5e92514-tmp\") pod \"openshift-lws-operator-bfc7f696d-pzdhj\" (UID: \"23b1fea5-f79c-4b92-af32-5017b5e92514\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:18.820788 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.820675 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9d7f\" (UniqueName: \"kubernetes.io/projected/23b1fea5-f79c-4b92-af32-5017b5e92514-kube-api-access-l9d7f\") pod \"openshift-lws-operator-bfc7f696d-pzdhj\" (UID: \"23b1fea5-f79c-4b92-af32-5017b5e92514\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:18.821025 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.821004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23b1fea5-f79c-4b92-af32-5017b5e92514-tmp\") pod \"openshift-lws-operator-bfc7f696d-pzdhj\" (UID: \"23b1fea5-f79c-4b92-af32-5017b5e92514\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:18.828425 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.828393 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9d7f\" (UniqueName: \"kubernetes.io/projected/23b1fea5-f79c-4b92-af32-5017b5e92514-kube-api-access-l9d7f\") pod \"openshift-lws-operator-bfc7f696d-pzdhj\" (UID: \"23b1fea5-f79c-4b92-af32-5017b5e92514\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:18.918444 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:18.918387 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" Apr 19 12:15:19.045651 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:19.045605 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj"] Apr 19 12:15:19.048049 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:19.048023 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23b1fea5_f79c_4b92_af32_5017b5e92514.slice/crio-477000ba877121a6f39ea47313be007ee3b55b4891e790adfcf781a61b1e0e99 WatchSource:0}: Error finding container 477000ba877121a6f39ea47313be007ee3b55b4891e790adfcf781a61b1e0e99: Status 404 returned error can't find the container with id 477000ba877121a6f39ea47313be007ee3b55b4891e790adfcf781a61b1e0e99 Apr 19 12:15:19.084842 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:19.084819 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" event={"ID":"23b1fea5-f79c-4b92-af32-5017b5e92514","Type":"ContainerStarted","Data":"477000ba877121a6f39ea47313be007ee3b55b4891e790adfcf781a61b1e0e99"} Apr 19 12:15:21.092526 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:21.092497 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" event={"ID":"23b1fea5-f79c-4b92-af32-5017b5e92514","Type":"ContainerStarted","Data":"25eda3652fa679af4c6ac494f787669380492a306993ba9ca30b921bd305c8a4"} Apr 19 12:15:21.106726 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:21.106677 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-pzdhj" podStartSLOduration=1.140088516 podStartE2EDuration="3.106661502s" podCreationTimestamp="2026-04-19 12:15:18 +0000 UTC" firstStartedPulling="2026-04-19 12:15:19.049679752 +0000 UTC m=+337.768503901" 
lastFinishedPulling="2026-04-19 12:15:21.016252754 +0000 UTC m=+339.735076887" observedRunningTime="2026-04-19 12:15:21.105431314 +0000 UTC m=+339.824255482" watchObservedRunningTime="2026-04-19 12:15:21.106661502 +0000 UTC m=+339.825485656" Apr 19 12:15:22.079702 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:22.079676 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-jc8rz" Apr 19 12:15:23.047412 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.047376 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-rp6ll"] Apr 19 12:15:23.050731 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.050713 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.053106 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.053084 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-752b2\"" Apr 19 12:15:23.057891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.057868 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rp6ll"] Apr 19 12:15:23.159063 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.159012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2780998-0a5f-4462-804a-4da9d5faa5cf-bound-sa-token\") pod \"cert-manager-759f64656b-rp6ll\" (UID: \"d2780998-0a5f-4462-804a-4da9d5faa5cf\") " pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.159063 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.159067 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hn6t\" (UniqueName: \"kubernetes.io/projected/d2780998-0a5f-4462-804a-4da9d5faa5cf-kube-api-access-8hn6t\") pod 
\"cert-manager-759f64656b-rp6ll\" (UID: \"d2780998-0a5f-4462-804a-4da9d5faa5cf\") " pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.259977 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.259944 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2780998-0a5f-4462-804a-4da9d5faa5cf-bound-sa-token\") pod \"cert-manager-759f64656b-rp6ll\" (UID: \"d2780998-0a5f-4462-804a-4da9d5faa5cf\") " pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.259977 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.259978 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hn6t\" (UniqueName: \"kubernetes.io/projected/d2780998-0a5f-4462-804a-4da9d5faa5cf-kube-api-access-8hn6t\") pod \"cert-manager-759f64656b-rp6ll\" (UID: \"d2780998-0a5f-4462-804a-4da9d5faa5cf\") " pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.268177 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.268152 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2780998-0a5f-4462-804a-4da9d5faa5cf-bound-sa-token\") pod \"cert-manager-759f64656b-rp6ll\" (UID: \"d2780998-0a5f-4462-804a-4da9d5faa5cf\") " pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.268348 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.268330 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hn6t\" (UniqueName: \"kubernetes.io/projected/d2780998-0a5f-4462-804a-4da9d5faa5cf-kube-api-access-8hn6t\") pod \"cert-manager-759f64656b-rp6ll\" (UID: \"d2780998-0a5f-4462-804a-4da9d5faa5cf\") " pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.361082 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.361009 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rp6ll" Apr 19 12:15:23.477575 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.477550 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rp6ll"] Apr 19 12:15:23.479658 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:23.479613 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2780998_0a5f_4462_804a_4da9d5faa5cf.slice/crio-b4edee444437d75fa6b715cab19c8f3327960d76df0642c9e747dcc5b79f045b WatchSource:0}: Error finding container b4edee444437d75fa6b715cab19c8f3327960d76df0642c9e747dcc5b79f045b: Status 404 returned error can't find the container with id b4edee444437d75fa6b715cab19c8f3327960d76df0642c9e747dcc5b79f045b Apr 19 12:15:23.617371 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.617289 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp"] Apr 19 12:15:23.620907 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.620891 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.623015 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.622989 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:15:23.623135 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.623120 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-r2m55\"" Apr 19 12:15:23.623273 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.623258 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:15:23.628105 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.628080 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp"] Apr 19 12:15:23.764069 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.764031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skd6x\" (UniqueName: \"kubernetes.io/projected/f37e9655-ef18-4969-a786-2d210d90bca1-kube-api-access-skd6x\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.764255 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.764093 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.764255 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.764221 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.865524 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.865489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.865740 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.865541 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skd6x\" (UniqueName: \"kubernetes.io/projected/f37e9655-ef18-4969-a786-2d210d90bca1-kube-api-access-skd6x\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.865740 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.865598 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.865972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.865951 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.866006 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.865971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.876135 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.876074 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skd6x\" (UniqueName: \"kubernetes.io/projected/f37e9655-ef18-4969-a786-2d210d90bca1-kube-api-access-skd6x\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:23.931069 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:23.931029 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:24.047874 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:24.047728 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp"] Apr 19 12:15:24.049987 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:24.049959 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf37e9655_ef18_4969_a786_2d210d90bca1.slice/crio-37afcab1b0de4ea56289e71114ec4d440b61d9e6ac5055715400457179f1e215 WatchSource:0}: Error finding container 37afcab1b0de4ea56289e71114ec4d440b61d9e6ac5055715400457179f1e215: Status 404 returned error can't find the container with id 37afcab1b0de4ea56289e71114ec4d440b61d9e6ac5055715400457179f1e215 Apr 19 12:15:24.102705 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:24.102678 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rp6ll" event={"ID":"d2780998-0a5f-4462-804a-4da9d5faa5cf","Type":"ContainerStarted","Data":"72816cfba2019e48edb1fa50a7997283db91b359e8e13cf123464707de30ecd9"} Apr 19 12:15:24.102809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:24.102714 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rp6ll" event={"ID":"d2780998-0a5f-4462-804a-4da9d5faa5cf","Type":"ContainerStarted","Data":"b4edee444437d75fa6b715cab19c8f3327960d76df0642c9e747dcc5b79f045b"} Apr 19 12:15:24.104281 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:24.104259 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" event={"ID":"f37e9655-ef18-4969-a786-2d210d90bca1","Type":"ContainerStarted","Data":"37afcab1b0de4ea56289e71114ec4d440b61d9e6ac5055715400457179f1e215"} Apr 19 12:15:24.116961 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:24.116922 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-rp6ll" podStartSLOduration=1.116908042 podStartE2EDuration="1.116908042s" podCreationTimestamp="2026-04-19 12:15:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:15:24.115461053 +0000 UTC m=+342.834285208" watchObservedRunningTime="2026-04-19 12:15:24.116908042 +0000 UTC m=+342.835732196" Apr 19 12:15:25.109044 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:25.108963 2568 generic.go:358] "Generic (PLEG): container finished" podID="f37e9655-ef18-4969-a786-2d210d90bca1" containerID="0e7a5833ae07f35597630d75125bfd67ff2b94878294f5df24c6cda22429c3eb" exitCode=0 Apr 19 12:15:25.109450 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:25.109054 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" event={"ID":"f37e9655-ef18-4969-a786-2d210d90bca1","Type":"ContainerDied","Data":"0e7a5833ae07f35597630d75125bfd67ff2b94878294f5df24c6cda22429c3eb"} Apr 19 12:15:26.113440 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:26.113358 2568 generic.go:358] "Generic (PLEG): container finished" podID="f37e9655-ef18-4969-a786-2d210d90bca1" containerID="02634f6de6da17e03df4d3dd93a7fc1a5908b40847a519c42dab31369425b313" exitCode=0 Apr 19 12:15:26.113789 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:26.113448 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" event={"ID":"f37e9655-ef18-4969-a786-2d210d90bca1","Type":"ContainerDied","Data":"02634f6de6da17e03df4d3dd93a7fc1a5908b40847a519c42dab31369425b313"} Apr 19 12:15:27.117946 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:27.117913 2568 generic.go:358] "Generic (PLEG): 
container finished" podID="f37e9655-ef18-4969-a786-2d210d90bca1" containerID="b10de31788da801916f8409590376f20653cea467cd33257f01bd2cfb94f7bdd" exitCode=0 Apr 19 12:15:27.118299 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:27.118002 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" event={"ID":"f37e9655-ef18-4969-a786-2d210d90bca1","Type":"ContainerDied","Data":"b10de31788da801916f8409590376f20653cea467cd33257f01bd2cfb94f7bdd"} Apr 19 12:15:28.238034 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.238013 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:28.402879 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.402811 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-bundle\") pod \"f37e9655-ef18-4969-a786-2d210d90bca1\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " Apr 19 12:15:28.402879 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.402861 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skd6x\" (UniqueName: \"kubernetes.io/projected/f37e9655-ef18-4969-a786-2d210d90bca1-kube-api-access-skd6x\") pod \"f37e9655-ef18-4969-a786-2d210d90bca1\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " Apr 19 12:15:28.403075 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.402880 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-util\") pod \"f37e9655-ef18-4969-a786-2d210d90bca1\" (UID: \"f37e9655-ef18-4969-a786-2d210d90bca1\") " Apr 19 12:15:28.403606 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.403574 2568 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-bundle" (OuterVolumeSpecName: "bundle") pod "f37e9655-ef18-4969-a786-2d210d90bca1" (UID: "f37e9655-ef18-4969-a786-2d210d90bca1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:28.404913 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.404885 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37e9655-ef18-4969-a786-2d210d90bca1-kube-api-access-skd6x" (OuterVolumeSpecName: "kube-api-access-skd6x") pod "f37e9655-ef18-4969-a786-2d210d90bca1" (UID: "f37e9655-ef18-4969-a786-2d210d90bca1"). InnerVolumeSpecName "kube-api-access-skd6x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:15:28.407788 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.407751 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-util" (OuterVolumeSpecName: "util") pod "f37e9655-ef18-4969-a786-2d210d90bca1" (UID: "f37e9655-ef18-4969-a786-2d210d90bca1"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:28.504381 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.504352 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:28.504381 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.504378 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-skd6x\" (UniqueName: \"kubernetes.io/projected/f37e9655-ef18-4969-a786-2d210d90bca1-kube-api-access-skd6x\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:28.504536 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:28.504389 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f37e9655-ef18-4969-a786-2d210d90bca1-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:29.126264 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:29.126234 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" Apr 19 12:15:29.126497 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:29.126234 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5wjjqp" event={"ID":"f37e9655-ef18-4969-a786-2d210d90bca1","Type":"ContainerDied","Data":"37afcab1b0de4ea56289e71114ec4d440b61d9e6ac5055715400457179f1e215"} Apr 19 12:15:29.126497 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:29.126343 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37afcab1b0de4ea56289e71114ec4d440b61d9e6ac5055715400457179f1e215" Apr 19 12:15:36.225290 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225257 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj"] Apr 19 12:15:36.225744 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225587 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f37e9655-ef18-4969-a786-2d210d90bca1" containerName="util" Apr 19 12:15:36.225744 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225598 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37e9655-ef18-4969-a786-2d210d90bca1" containerName="util" Apr 19 12:15:36.225744 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225608 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f37e9655-ef18-4969-a786-2d210d90bca1" containerName="pull" Apr 19 12:15:36.225744 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225616 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37e9655-ef18-4969-a786-2d210d90bca1" containerName="pull" Apr 19 12:15:36.225744 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225641 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f37e9655-ef18-4969-a786-2d210d90bca1" containerName="extract" Apr 19 12:15:36.225744 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225647 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37e9655-ef18-4969-a786-2d210d90bca1" containerName="extract" Apr 19 12:15:36.225744 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.225703 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f37e9655-ef18-4969-a786-2d210d90bca1" containerName="extract" Apr 19 12:15:36.230167 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.230149 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.232369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.232352 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:15:36.233351 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.233333 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:15:36.233502 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.233405 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-r2m55\"" Apr 19 12:15:36.235903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.235883 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj"] Apr 19 12:15:36.369276 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.369244 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: 
\"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.369422 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.369296 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqbrh\" (UniqueName: \"kubernetes.io/projected/f9256b97-8a1c-4695-b291-1b37dd2d2c56-kube-api-access-nqbrh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.369422 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.369369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.470722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.470690 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.470903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.470745 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqbrh\" (UniqueName: \"kubernetes.io/projected/f9256b97-8a1c-4695-b291-1b37dd2d2c56-kube-api-access-nqbrh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: 
\"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.470903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.470797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.471149 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.471127 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.471224 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.471170 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.478052 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.477995 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqbrh\" (UniqueName: \"kubernetes.io/projected/f9256b97-8a1c-4695-b291-1b37dd2d2c56-kube-api-access-nqbrh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.541091 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.541067 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:36.660571 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:36.660547 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj"] Apr 19 12:15:36.662847 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:36.662817 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9256b97_8a1c_4695_b291_1b37dd2d2c56.slice/crio-d9a59aa1b0050ce05f7fc68f0674d30395c0b9b1eafdfedacbbcb1f2b46b5a2f WatchSource:0}: Error finding container d9a59aa1b0050ce05f7fc68f0674d30395c0b9b1eafdfedacbbcb1f2b46b5a2f: Status 404 returned error can't find the container with id d9a59aa1b0050ce05f7fc68f0674d30395c0b9b1eafdfedacbbcb1f2b46b5a2f Apr 19 12:15:37.154456 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:37.154364 2568 generic.go:358] "Generic (PLEG): container finished" podID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerID="2e0695c1ea8efec83a9ef77661a376bfaf32a12e365e25f3a850621275ca1aa0" exitCode=0 Apr 19 12:15:37.154612 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:37.154452 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" event={"ID":"f9256b97-8a1c-4695-b291-1b37dd2d2c56","Type":"ContainerDied","Data":"2e0695c1ea8efec83a9ef77661a376bfaf32a12e365e25f3a850621275ca1aa0"} Apr 19 12:15:37.154612 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:37.154491 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" event={"ID":"f9256b97-8a1c-4695-b291-1b37dd2d2c56","Type":"ContainerStarted","Data":"d9a59aa1b0050ce05f7fc68f0674d30395c0b9b1eafdfedacbbcb1f2b46b5a2f"} Apr 19 12:15:38.159877 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.159789 2568 generic.go:358] "Generic (PLEG): container finished" podID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerID="216249f0da2c9ac67a400954d6c82e244a8c2b008a62fbfedbedacadb71614e9" exitCode=0 Apr 19 12:15:38.160211 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.159879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" event={"ID":"f9256b97-8a1c-4695-b291-1b37dd2d2c56","Type":"ContainerDied","Data":"216249f0da2c9ac67a400954d6c82e244a8c2b008a62fbfedbedacadb71614e9"} Apr 19 12:15:38.368861 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.368835 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn"] Apr 19 12:15:38.372009 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.371993 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.375441 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.375385 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 19 12:15:38.375619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.375606 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-66xtk\"" Apr 19 12:15:38.376189 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.376174 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 19 12:15:38.376546 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.376531 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 19 12:15:38.377994 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.377978 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 19 12:15:38.386016 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.385997 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn"] Apr 19 12:15:38.492567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.488388 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.492567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.488458 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbsp\" (UniqueName: \"kubernetes.io/projected/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-kube-api-access-trbsp\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.492567 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.488553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.589649 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.589588 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.589848 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.589745 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trbsp\" (UniqueName: \"kubernetes.io/projected/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-kube-api-access-trbsp\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.589848 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.589812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.592215 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.592185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.592323 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.592216 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.603425 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.603400 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbsp\" (UniqueName: \"kubernetes.io/projected/bf8bd34d-5a0d-40e3-b837-24ba4ae7e215-kube-api-access-trbsp\") pod \"opendatahub-operator-controller-manager-676bcb86f4-hhwxn\" (UID: \"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.681416 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.681378 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:38.801483 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:38.801442 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn"] Apr 19 12:15:38.805673 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:38.805615 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8bd34d_5a0d_40e3_b837_24ba4ae7e215.slice/crio-0cf4d0ddd1f660d2f8c7ea38d035cf8bd70f36a5d813d8a1e157d3b16d9c78af WatchSource:0}: Error finding container 0cf4d0ddd1f660d2f8c7ea38d035cf8bd70f36a5d813d8a1e157d3b16d9c78af: Status 404 returned error can't find the container with id 0cf4d0ddd1f660d2f8c7ea38d035cf8bd70f36a5d813d8a1e157d3b16d9c78af Apr 19 12:15:39.164014 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:39.163927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" event={"ID":"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215","Type":"ContainerStarted","Data":"0cf4d0ddd1f660d2f8c7ea38d035cf8bd70f36a5d813d8a1e157d3b16d9c78af"} Apr 19 12:15:39.165751 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:39.165727 2568 generic.go:358] "Generic (PLEG): container finished" podID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerID="96f4f85fa79456c9eb8b9b1c11d12e2b6c9927e0e93738fe186d9974a8cfdd16" exitCode=0 Apr 19 12:15:39.165866 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:39.165769 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" event={"ID":"f9256b97-8a1c-4695-b291-1b37dd2d2c56","Type":"ContainerDied","Data":"96f4f85fa79456c9eb8b9b1c11d12e2b6c9927e0e93738fe186d9974a8cfdd16"} Apr 19 12:15:40.312766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.312744 2568 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:40.403238 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.403208 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-util\") pod \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " Apr 19 12:15:40.403392 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.403258 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-bundle\") pod \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " Apr 19 12:15:40.403392 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.403323 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqbrh\" (UniqueName: \"kubernetes.io/projected/f9256b97-8a1c-4695-b291-1b37dd2d2c56-kube-api-access-nqbrh\") pod \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\" (UID: \"f9256b97-8a1c-4695-b291-1b37dd2d2c56\") " Apr 19 12:15:40.404994 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.404957 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-bundle" (OuterVolumeSpecName: "bundle") pod "f9256b97-8a1c-4695-b291-1b37dd2d2c56" (UID: "f9256b97-8a1c-4695-b291-1b37dd2d2c56"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:40.406574 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.406519 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9256b97-8a1c-4695-b291-1b37dd2d2c56-kube-api-access-nqbrh" (OuterVolumeSpecName: "kube-api-access-nqbrh") pod "f9256b97-8a1c-4695-b291-1b37dd2d2c56" (UID: "f9256b97-8a1c-4695-b291-1b37dd2d2c56"). InnerVolumeSpecName "kube-api-access-nqbrh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:15:40.410999 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.410960 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-util" (OuterVolumeSpecName: "util") pod "f9256b97-8a1c-4695-b291-1b37dd2d2c56" (UID: "f9256b97-8a1c-4695-b291-1b37dd2d2c56"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:40.504171 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.504141 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:40.504171 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.504169 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9256b97-8a1c-4695-b291-1b37dd2d2c56-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:40.504171 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:40.504179 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nqbrh\" (UniqueName: \"kubernetes.io/projected/f9256b97-8a1c-4695-b291-1b37dd2d2c56-kube-api-access-nqbrh\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:41.176477 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:41.176451 2568 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" Apr 19 12:15:41.176660 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:41.176445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9b5lgj" event={"ID":"f9256b97-8a1c-4695-b291-1b37dd2d2c56","Type":"ContainerDied","Data":"d9a59aa1b0050ce05f7fc68f0674d30395c0b9b1eafdfedacbbcb1f2b46b5a2f"} Apr 19 12:15:41.176660 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:41.176598 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9a59aa1b0050ce05f7fc68f0674d30395c0b9b1eafdfedacbbcb1f2b46b5a2f" Apr 19 12:15:42.181031 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:42.180992 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" event={"ID":"bf8bd34d-5a0d-40e3-b837-24ba4ae7e215","Type":"ContainerStarted","Data":"138cfe2cb9f2b71fe63e9b72b720ed7f11a8eba022fbcbfd946191db058b5c9d"} Apr 19 12:15:42.181510 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:42.181142 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:42.203829 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:42.203784 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" podStartSLOduration=1.325667693 podStartE2EDuration="4.203769669s" podCreationTimestamp="2026-04-19 12:15:38 +0000 UTC" firstStartedPulling="2026-04-19 12:15:38.807944246 +0000 UTC m=+357.526768384" lastFinishedPulling="2026-04-19 12:15:41.686046224 +0000 UTC m=+360.404870360" observedRunningTime="2026-04-19 12:15:42.20168727 +0000 UTC m=+360.920511423" watchObservedRunningTime="2026-04-19 12:15:42.203769669 
+0000 UTC m=+360.922593823" Apr 19 12:15:53.186727 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:53.186698 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-hhwxn" Apr 19 12:15:55.382434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382391 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh"] Apr 19 12:15:55.382836 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382767 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerName="util" Apr 19 12:15:55.382836 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382778 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerName="util" Apr 19 12:15:55.382836 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382790 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerName="extract" Apr 19 12:15:55.382836 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382796 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerName="extract" Apr 19 12:15:55.382836 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382810 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerName="pull" Apr 19 12:15:55.382836 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382815 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerName="pull" Apr 19 12:15:55.383023 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.382868 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9256b97-8a1c-4695-b291-1b37dd2d2c56" containerName="extract" Apr 19 
12:15:55.385722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.385706 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.388064 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.388040 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:15:55.388064 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.388047 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-r2m55\"" Apr 19 12:15:55.388252 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.388088 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:15:55.394160 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.394136 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh"] Apr 19 12:15:55.426669 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.426613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.426804 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.426679 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: 
\"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.426804 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.426708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lx46\" (UniqueName: \"kubernetes.io/projected/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-kube-api-access-2lx46\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.527800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.527762 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.527800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.527800 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.528065 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.527827 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lx46\" (UniqueName: \"kubernetes.io/projected/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-kube-api-access-2lx46\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: 
\"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.528225 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.528202 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.528289 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.528264 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.536032 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.535998 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lx46\" (UniqueName: \"kubernetes.io/projected/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-kube-api-access-2lx46\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.696307 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.696261 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:55.820984 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.820961 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh"] Apr 19 12:15:55.823237 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:55.823207 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba151cd1_fcf8_4aaa_86b9_8e1bc5f44c03.slice/crio-f7659bb90bea1fcc395a27d0705dcf5df55c52f84fef464f4a949bb9469a03e9 WatchSource:0}: Error finding container f7659bb90bea1fcc395a27d0705dcf5df55c52f84fef464f4a949bb9469a03e9: Status 404 returned error can't find the container with id f7659bb90bea1fcc395a27d0705dcf5df55c52f84fef464f4a949bb9469a03e9 Apr 19 12:15:55.926430 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.926403 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-58497579d8-6g4hh"] Apr 19 12:15:55.929524 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.929506 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:55.931947 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.931923 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 19 12:15:55.932040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.931922 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-9qgz6\"" Apr 19 12:15:55.932040 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.931922 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 19 12:15:55.939405 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:55.939381 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-58497579d8-6g4hh"] Apr 19 12:15:56.032031 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.031995 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e633d622-52e6-40a4-aef0-84f7a013542b-tmp\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.032031 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.032032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzrm7\" (UniqueName: \"kubernetes.io/projected/e633d622-52e6-40a4-aef0-84f7a013542b-kube-api-access-qzrm7\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.032240 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.032059 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e633d622-52e6-40a4-aef0-84f7a013542b-tls-certs\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.133131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.133096 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e633d622-52e6-40a4-aef0-84f7a013542b-tmp\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.133131 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.133133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzrm7\" (UniqueName: \"kubernetes.io/projected/e633d622-52e6-40a4-aef0-84f7a013542b-kube-api-access-qzrm7\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.133354 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.133257 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e633d622-52e6-40a4-aef0-84f7a013542b-tls-certs\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.135371 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.135345 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e633d622-52e6-40a4-aef0-84f7a013542b-tmp\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.135568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.135551 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e633d622-52e6-40a4-aef0-84f7a013542b-tls-certs\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.139793 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.139775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzrm7\" (UniqueName: \"kubernetes.io/projected/e633d622-52e6-40a4-aef0-84f7a013542b-kube-api-access-qzrm7\") pod \"kube-auth-proxy-58497579d8-6g4hh\" (UID: \"e633d622-52e6-40a4-aef0-84f7a013542b\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.228813 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.228736 2568 generic.go:358] "Generic (PLEG): container finished" podID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerID="d35c6515bbae4685722f12bb7cb4a24a2b3f0db0fd81343c6328d0f1ae01b6e2" exitCode=0 Apr 19 12:15:56.228939 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.228816 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" event={"ID":"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03","Type":"ContainerDied","Data":"d35c6515bbae4685722f12bb7cb4a24a2b3f0db0fd81343c6328d0f1ae01b6e2"} Apr 19 12:15:56.228939 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.228850 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" event={"ID":"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03","Type":"ContainerStarted","Data":"f7659bb90bea1fcc395a27d0705dcf5df55c52f84fef464f4a949bb9469a03e9"} Apr 19 12:15:56.276143 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.276120 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" Apr 19 12:15:56.392425 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:56.392401 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-58497579d8-6g4hh"] Apr 19 12:15:56.394669 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:15:56.394637 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode633d622_52e6_40a4_aef0_84f7a013542b.slice/crio-3112b12ad8f5a72339ef4ecfbe0e1314fdb8272e5eecaebec0245e7536a703e2 WatchSource:0}: Error finding container 3112b12ad8f5a72339ef4ecfbe0e1314fdb8272e5eecaebec0245e7536a703e2: Status 404 returned error can't find the container with id 3112b12ad8f5a72339ef4ecfbe0e1314fdb8272e5eecaebec0245e7536a703e2 Apr 19 12:15:57.235140 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:57.235100 2568 generic.go:358] "Generic (PLEG): container finished" podID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerID="6cbcb3d747ecb8f61a3ef398123974079e9cb787c246bd5d83ac6d7c52697d52" exitCode=0 Apr 19 12:15:57.235319 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:57.235230 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" event={"ID":"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03","Type":"ContainerDied","Data":"6cbcb3d747ecb8f61a3ef398123974079e9cb787c246bd5d83ac6d7c52697d52"} Apr 19 12:15:57.236980 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:57.236925 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" event={"ID":"e633d622-52e6-40a4-aef0-84f7a013542b","Type":"ContainerStarted","Data":"3112b12ad8f5a72339ef4ecfbe0e1314fdb8272e5eecaebec0245e7536a703e2"} Apr 19 12:15:58.248689 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:58.248651 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerID="ac1fb91b754712df717157104b4632deb0a375486b464a07b1ce78594c948ab1" exitCode=0 Apr 19 12:15:58.249108 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:58.248704 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" event={"ID":"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03","Type":"ContainerDied","Data":"ac1fb91b754712df717157104b4632deb0a375486b464a07b1ce78594c948ab1"} Apr 19 12:15:59.671083 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.671059 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:15:59.764673 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.764540 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-util\") pod \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " Apr 19 12:15:59.764673 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.764587 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-bundle\") pod \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " Apr 19 12:15:59.764673 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.764657 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lx46\" (UniqueName: \"kubernetes.io/projected/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-kube-api-access-2lx46\") pod \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\" (UID: \"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03\") " Apr 19 12:15:59.765948 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.765922 2568 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-bundle" (OuterVolumeSpecName: "bundle") pod "ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" (UID: "ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:59.767682 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.767640 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-kube-api-access-2lx46" (OuterVolumeSpecName: "kube-api-access-2lx46") pod "ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" (UID: "ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03"). InnerVolumeSpecName "kube-api-access-2lx46". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:15:59.772042 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.772013 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-util" (OuterVolumeSpecName: "util") pod "ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" (UID: "ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:15:59.865685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.865654 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:59.865685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.865678 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:15:59.865685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:15:59.865688 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lx46\" (UniqueName: \"kubernetes.io/projected/ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03-kube-api-access-2lx46\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:16:00.260920 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:00.260890 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" Apr 19 12:16:00.261094 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:00.260897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835mj6bh" event={"ID":"ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03","Type":"ContainerDied","Data":"f7659bb90bea1fcc395a27d0705dcf5df55c52f84fef464f4a949bb9469a03e9"} Apr 19 12:16:00.261094 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:00.261004 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7659bb90bea1fcc395a27d0705dcf5df55c52f84fef464f4a949bb9469a03e9" Apr 19 12:16:00.262331 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:00.262302 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" event={"ID":"e633d622-52e6-40a4-aef0-84f7a013542b","Type":"ContainerStarted","Data":"a63b20f09a6936a30e0868b6db819e603d5a77cd090cf6da03d138f97dc2dce9"} Apr 19 12:16:00.278114 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:00.278067 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-58497579d8-6g4hh" podStartSLOduration=1.963099436 podStartE2EDuration="5.278055862s" podCreationTimestamp="2026-04-19 12:15:55 +0000 UTC" firstStartedPulling="2026-04-19 12:15:56.396316391 +0000 UTC m=+375.115140524" lastFinishedPulling="2026-04-19 12:15:59.711272802 +0000 UTC m=+378.430096950" observedRunningTime="2026-04-19 12:16:00.277304165 +0000 UTC m=+378.996128321" watchObservedRunningTime="2026-04-19 12:16:00.278055862 +0000 UTC m=+378.996880017" Apr 19 12:16:06.334776 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.334740 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mxx94"] Apr 19 12:16:06.335270 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:16:06.335252 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerName="util" Apr 19 12:16:06.335335 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.335272 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerName="util" Apr 19 12:16:06.335335 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.335300 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerName="pull" Apr 19 12:16:06.335335 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.335309 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerName="pull" Apr 19 12:16:06.335335 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.335324 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerName="extract" Apr 19 12:16:06.335335 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.335333 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerName="extract" Apr 19 12:16:06.335570 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.335422 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba151cd1-fcf8-4aaa-86b9-8e1bc5f44c03" containerName="extract" Apr 19 12:16:06.338680 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.338659 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:06.342101 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.342083 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 19 12:16:06.343329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.343306 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-8fkbm\"" Apr 19 12:16:06.350231 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.350211 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mxx94"] Apr 19 12:16:06.418050 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.418022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33648a89-a48d-4908-9aa7-b933d1e02e8c-cert\") pod \"kserve-controller-manager-856948b99f-mxx94\" (UID: \"33648a89-a48d-4908-9aa7-b933d1e02e8c\") " pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:06.418171 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.418090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zvcc\" (UniqueName: \"kubernetes.io/projected/33648a89-a48d-4908-9aa7-b933d1e02e8c-kube-api-access-2zvcc\") pod \"kserve-controller-manager-856948b99f-mxx94\" (UID: \"33648a89-a48d-4908-9aa7-b933d1e02e8c\") " pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:06.518472 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.518446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33648a89-a48d-4908-9aa7-b933d1e02e8c-cert\") pod \"kserve-controller-manager-856948b99f-mxx94\" (UID: \"33648a89-a48d-4908-9aa7-b933d1e02e8c\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:06.518620 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.518538 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zvcc\" (UniqueName: \"kubernetes.io/projected/33648a89-a48d-4908-9aa7-b933d1e02e8c-kube-api-access-2zvcc\") pod \"kserve-controller-manager-856948b99f-mxx94\" (UID: \"33648a89-a48d-4908-9aa7-b933d1e02e8c\") " pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:06.518685 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:16:06.518618 2568 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 19 12:16:06.518737 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:16:06.518711 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33648a89-a48d-4908-9aa7-b933d1e02e8c-cert podName:33648a89-a48d-4908-9aa7-b933d1e02e8c nodeName:}" failed. No retries permitted until 2026-04-19 12:16:07.018689907 +0000 UTC m=+385.737514050 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33648a89-a48d-4908-9aa7-b933d1e02e8c-cert") pod "kserve-controller-manager-856948b99f-mxx94" (UID: "33648a89-a48d-4908-9aa7-b933d1e02e8c") : secret "kserve-webhook-server-cert" not found Apr 19 12:16:06.530532 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:06.530501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zvcc\" (UniqueName: \"kubernetes.io/projected/33648a89-a48d-4908-9aa7-b933d1e02e8c-kube-api-access-2zvcc\") pod \"kserve-controller-manager-856948b99f-mxx94\" (UID: \"33648a89-a48d-4908-9aa7-b933d1e02e8c\") " pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:07.024392 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:07.024360 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33648a89-a48d-4908-9aa7-b933d1e02e8c-cert\") pod \"kserve-controller-manager-856948b99f-mxx94\" (UID: \"33648a89-a48d-4908-9aa7-b933d1e02e8c\") " pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:07.026722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:07.026703 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33648a89-a48d-4908-9aa7-b933d1e02e8c-cert\") pod \"kserve-controller-manager-856948b99f-mxx94\" (UID: \"33648a89-a48d-4908-9aa7-b933d1e02e8c\") " pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:07.250671 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:07.250614 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:07.365684 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:07.365661 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mxx94"] Apr 19 12:16:07.367516 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:16:07.367484 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33648a89_a48d_4908_9aa7_b933d1e02e8c.slice/crio-57cc48479043a645dff8d91c51a8ab48e49307547bfa26245454c88398467941 WatchSource:0}: Error finding container 57cc48479043a645dff8d91c51a8ab48e49307547bfa26245454c88398467941: Status 404 returned error can't find the container with id 57cc48479043a645dff8d91c51a8ab48e49307547bfa26245454c88398467941 Apr 19 12:16:08.297384 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:08.297345 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" event={"ID":"33648a89-a48d-4908-9aa7-b933d1e02e8c","Type":"ContainerStarted","Data":"57cc48479043a645dff8d91c51a8ab48e49307547bfa26245454c88398467941"} Apr 19 12:16:09.339128 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.339095 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52"] Apr 19 12:16:09.344473 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.344450 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.346962 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.346936 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-r2m55\"" Apr 19 12:16:09.347292 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.347270 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 19 12:16:09.347372 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.347276 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 19 12:16:09.353195 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.353167 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52"] Apr 19 12:16:09.445246 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.445200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz4bk\" (UniqueName: \"kubernetes.io/projected/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-kube-api-access-hz4bk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.445428 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.445269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.445428 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.445346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.546738 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.546696 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hz4bk\" (UniqueName: \"kubernetes.io/projected/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-kube-api-access-hz4bk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.546899 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.546791 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.546899 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.546833 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.547290 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.547256 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.547388 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.547300 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.557955 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.557926 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz4bk\" (UniqueName: \"kubernetes.io/projected/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-kube-api-access-hz4bk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.662745 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.662665 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:09.853291 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:09.853251 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52"] Apr 19 12:16:09.855704 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:16:09.855676 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d9e16d_c5ff_4ba7_88cd_c3ec9478a1bf.slice/crio-bd3f7e7efacd199a1a061b3c1a59e12af6588a3ffead058c3b0442fb05344515 WatchSource:0}: Error finding container bd3f7e7efacd199a1a061b3c1a59e12af6588a3ffead058c3b0442fb05344515: Status 404 returned error can't find the container with id bd3f7e7efacd199a1a061b3c1a59e12af6588a3ffead058c3b0442fb05344515 Apr 19 12:16:10.305498 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.305459 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" event={"ID":"33648a89-a48d-4908-9aa7-b933d1e02e8c","Type":"ContainerStarted","Data":"f10b634d6c99868b32567241deddc7068e8aba675659fd7124ab4c5c970bcf20"} Apr 19 12:16:10.305716 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.305561 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" Apr 19 12:16:10.306865 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.306843 2568 generic.go:358] "Generic (PLEG): container finished" podID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerID="0e271e6a89d73293e886d14f66c8b7d6ace343de66dcb20e09265a7b62ec732b" exitCode=0 Apr 19 12:16:10.306983 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.306926 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" 
event={"ID":"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf","Type":"ContainerDied","Data":"0e271e6a89d73293e886d14f66c8b7d6ace343de66dcb20e09265a7b62ec732b"} Apr 19 12:16:10.306983 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.306958 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" event={"ID":"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf","Type":"ContainerStarted","Data":"bd3f7e7efacd199a1a061b3c1a59e12af6588a3ffead058c3b0442fb05344515"} Apr 19 12:16:10.325683 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.325617 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-mxx94" podStartSLOduration=1.9119413600000001 podStartE2EDuration="4.325605516s" podCreationTimestamp="2026-04-19 12:16:06 +0000 UTC" firstStartedPulling="2026-04-19 12:16:07.368720287 +0000 UTC m=+386.087544422" lastFinishedPulling="2026-04-19 12:16:09.782384426 +0000 UTC m=+388.501208578" observedRunningTime="2026-04-19 12:16:10.324123111 +0000 UTC m=+389.042947277" watchObservedRunningTime="2026-04-19 12:16:10.325605516 +0000 UTC m=+389.044429671" Apr 19 12:16:10.529268 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.529236 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn"] Apr 19 12:16:10.532572 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.532556 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.535519 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.535499 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-wtbj4\"" Apr 19 12:16:10.535670 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.535652 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 19 12:16:10.535952 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.535941 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 19 12:16:10.543810 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.543789 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn"] Apr 19 12:16:10.656191 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.656127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/635ceb29-3be3-43cb-b0c6-784cd9393bf0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bfnrn\" (UID: \"635ceb29-3be3-43cb-b0c6-784cd9393bf0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.656191 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.656166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9td2z\" (UniqueName: \"kubernetes.io/projected/635ceb29-3be3-43cb-b0c6-784cd9393bf0-kube-api-access-9td2z\") pod \"servicemesh-operator3-55f49c5f94-bfnrn\" (UID: \"635ceb29-3be3-43cb-b0c6-784cd9393bf0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.757702 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.757664 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/635ceb29-3be3-43cb-b0c6-784cd9393bf0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bfnrn\" (UID: \"635ceb29-3be3-43cb-b0c6-784cd9393bf0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.757891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.757720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9td2z\" (UniqueName: \"kubernetes.io/projected/635ceb29-3be3-43cb-b0c6-784cd9393bf0-kube-api-access-9td2z\") pod \"servicemesh-operator3-55f49c5f94-bfnrn\" (UID: \"635ceb29-3be3-43cb-b0c6-784cd9393bf0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.760310 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.760284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/635ceb29-3be3-43cb-b0c6-784cd9393bf0-operator-config\") pod \"servicemesh-operator3-55f49c5f94-bfnrn\" (UID: \"635ceb29-3be3-43cb-b0c6-784cd9393bf0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.765230 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.765213 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9td2z\" (UniqueName: \"kubernetes.io/projected/635ceb29-3be3-43cb-b0c6-784cd9393bf0-kube-api-access-9td2z\") pod \"servicemesh-operator3-55f49c5f94-bfnrn\" (UID: \"635ceb29-3be3-43cb-b0c6-784cd9393bf0\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.842263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.842224 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:10.960844 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:10.960814 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn"] Apr 19 12:16:10.964294 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:16:10.964263 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635ceb29_3be3_43cb_b0c6_784cd9393bf0.slice/crio-35dbe72ed63014fd834d2a246888325cc6954c49c813329f9a91801d7ff3a9ea WatchSource:0}: Error finding container 35dbe72ed63014fd834d2a246888325cc6954c49c813329f9a91801d7ff3a9ea: Status 404 returned error can't find the container with id 35dbe72ed63014fd834d2a246888325cc6954c49c813329f9a91801d7ff3a9ea Apr 19 12:16:11.311459 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:11.311427 2568 generic.go:358] "Generic (PLEG): container finished" podID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerID="22aa99d4ebcff8182d2a3e3d76d8dc7818748dc6e042c2957ada02615dc4139e" exitCode=0 Apr 19 12:16:11.311610 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:11.311507 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" event={"ID":"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf","Type":"ContainerDied","Data":"22aa99d4ebcff8182d2a3e3d76d8dc7818748dc6e042c2957ada02615dc4139e"} Apr 19 12:16:11.312892 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:11.312864 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" event={"ID":"635ceb29-3be3-43cb-b0c6-784cd9393bf0","Type":"ContainerStarted","Data":"35dbe72ed63014fd834d2a246888325cc6954c49c813329f9a91801d7ff3a9ea"} Apr 19 12:16:12.319445 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:12.319413 2568 generic.go:358] "Generic (PLEG): container finished" 
podID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerID="33b1d1cb35d24de737cbe9589d942f61cdc1fc032e30f88a6c06e6e1f04e1cce" exitCode=0 Apr 19 12:16:12.319869 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:12.319485 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" event={"ID":"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf","Type":"ContainerDied","Data":"33b1d1cb35d24de737cbe9589d942f61cdc1fc032e30f88a6c06e6e1f04e1cce"} Apr 19 12:16:13.502032 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.502010 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:13.581448 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.581399 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz4bk\" (UniqueName: \"kubernetes.io/projected/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-kube-api-access-hz4bk\") pod \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " Apr 19 12:16:13.581528 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.581486 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-util\") pod \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " Apr 19 12:16:13.581562 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.581547 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-bundle\") pod \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\" (UID: \"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf\") " Apr 19 12:16:13.582663 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.582619 2568 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-bundle" (OuterVolumeSpecName: "bundle") pod "46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" (UID: "46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:16:13.583533 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.583503 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-kube-api-access-hz4bk" (OuterVolumeSpecName: "kube-api-access-hz4bk") pod "46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" (UID: "46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf"). InnerVolumeSpecName "kube-api-access-hz4bk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:16:13.586854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.586832 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-util" (OuterVolumeSpecName: "util") pod "46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" (UID: "46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:16:13.682743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.682705 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:16:13.682743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.682744 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hz4bk\" (UniqueName: \"kubernetes.io/projected/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-kube-api-access-hz4bk\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:16:13.682938 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:13.682760 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:16:14.333889 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:14.333856 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" event={"ID":"635ceb29-3be3-43cb-b0c6-784cd9393bf0","Type":"ContainerStarted","Data":"cea215ab0e0d01d99afa101e9614ed725ca91a73ce1117ad21e1b7e6d7fed7a9"} Apr 19 12:16:14.334055 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:14.333941 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" Apr 19 12:16:14.335602 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:14.335579 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" event={"ID":"46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf","Type":"ContainerDied","Data":"bd3f7e7efacd199a1a061b3c1a59e12af6588a3ffead058c3b0442fb05344515"} Apr 19 12:16:14.335739 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:16:14.335606 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vbf52" Apr 19 12:16:14.335739 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:14.335603 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd3f7e7efacd199a1a061b3c1a59e12af6588a3ffead058c3b0442fb05344515" Apr 19 12:16:14.353606 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:14.353567 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn" podStartSLOduration=1.773170845 podStartE2EDuration="4.353557706s" podCreationTimestamp="2026-04-19 12:16:10 +0000 UTC" firstStartedPulling="2026-04-19 12:16:10.966743873 +0000 UTC m=+389.685568009" lastFinishedPulling="2026-04-19 12:16:13.547130737 +0000 UTC m=+392.265954870" observedRunningTime="2026-04-19 12:16:14.352145801 +0000 UTC m=+393.070969957" watchObservedRunningTime="2026-04-19 12:16:14.353557706 +0000 UTC m=+393.072381860" Apr 19 12:16:18.911146 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.911080 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"] Apr 19 12:16:18.911614 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.911592 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerName="pull" Apr 19 12:16:18.911699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.911618 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerName="pull" Apr 19 12:16:18.911699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.911656 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerName="util" Apr 19 12:16:18.911699 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:16:18.911662 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerName="util"
Apr 19 12:16:18.911699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.911676 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerName="extract"
Apr 19 12:16:18.911699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.911682 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerName="extract"
Apr 19 12:16:18.911991 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.911733 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="46d9e16d-c5ff-4ba7-88cd-c3ec9478a1bf" containerName="extract"
Apr 19 12:16:18.920689 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.920667 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:18.922022 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.922000 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"]
Apr 19 12:16:18.922972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.922929 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 19 12:16:18.922972 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.922953 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 19 12:16:18.923136 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.923038 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 19 12:16:18.923136 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.923116 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-8zsbv\""
Apr 19 12:16:18.923249 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:18.923229 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 19 12:16:19.030514 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.030484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.030514 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.030516 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.030722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.030543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk8qt\" (UniqueName: \"kubernetes.io/projected/93664af5-2722-48d4-b948-7a32e4d3c11e-kube-api-access-qk8qt\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.030722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.030621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.030722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.030694 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/93664af5-2722-48d4-b948-7a32e4d3c11e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.030722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.030712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.030842 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.030744 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.131853 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.131818 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/93664af5-2722-48d4-b948-7a32e4d3c11e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.131853 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.131855 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.132027 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.131877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.132027 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.131935 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.132120 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.132097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.132170 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.132138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qk8qt\" (UniqueName: \"kubernetes.io/projected/93664af5-2722-48d4-b948-7a32e4d3c11e-kube-api-access-qk8qt\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.132284 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.132263 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.132715 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.132691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.134219 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.134193 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/93664af5-2722-48d4-b948-7a32e4d3c11e-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.134382 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.134363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.134495 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.134474 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.134667 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.134649 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.139150 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.139122 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/93664af5-2722-48d4-b948-7a32e4d3c11e-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.139263 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.139246 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk8qt\" (UniqueName: \"kubernetes.io/projected/93664af5-2722-48d4-b948-7a32e4d3c11e-kube-api-access-qk8qt\") pod \"istiod-openshift-gateway-55ff986f96-8tpgs\" (UID: \"93664af5-2722-48d4-b948-7a32e4d3c11e\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.230582 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.230544 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:19.351805 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:19.351779 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"]
Apr 19 12:16:19.354217 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:16:19.354162 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93664af5_2722_48d4_b948_7a32e4d3c11e.slice/crio-46f40b19d79084978b3621d836da505b14abf5834d05d3c5dfd5b10597d4e023 WatchSource:0}: Error finding container 46f40b19d79084978b3621d836da505b14abf5834d05d3c5dfd5b10597d4e023: Status 404 returned error can't find the container with id 46f40b19d79084978b3621d836da505b14abf5834d05d3c5dfd5b10597d4e023
Apr 19 12:16:20.361905 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:20.361865 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs" event={"ID":"93664af5-2722-48d4-b948-7a32e4d3c11e","Type":"ContainerStarted","Data":"46f40b19d79084978b3621d836da505b14abf5834d05d3c5dfd5b10597d4e023"}
Apr 19 12:16:21.777005 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:21.776947 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 19 12:16:21.777368 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:21.777046 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 19 12:16:22.373427 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:22.373391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs" event={"ID":"93664af5-2722-48d4-b948-7a32e4d3c11e","Type":"ContainerStarted","Data":"8a26bc3c57f1647104ab304c2aeab969a30bfe2d7fa21ed351e8d786a596477d"}
Apr 19 12:16:22.373739 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:22.373708 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:22.375280 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:22.375253 2568 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-8tpgs container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 19 12:16:22.375409 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:22.375309 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs" podUID="93664af5-2722-48d4-b948-7a32e4d3c11e" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 19 12:16:22.394760 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:22.394717 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs" podStartSLOduration=1.9740169600000002 podStartE2EDuration="4.394701362s" podCreationTimestamp="2026-04-19 12:16:18 +0000 UTC" firstStartedPulling="2026-04-19 12:16:19.355983804 +0000 UTC m=+398.074807936" lastFinishedPulling="2026-04-19 12:16:21.776668192 +0000 UTC m=+400.495492338" observedRunningTime="2026-04-19 12:16:22.393862267 +0000 UTC m=+401.112686424" watchObservedRunningTime="2026-04-19 12:16:22.394701362 +0000 UTC m=+401.113525517"
Apr 19 12:16:23.377540 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:23.377509 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8tpgs"
Apr 19 12:16:25.343062 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:25.343030 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-bfnrn"
Apr 19 12:16:41.318880 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:16:41.318842 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-mxx94"
Apr 19 12:17:08.383391 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.383319 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"]
Apr 19 12:17:08.386966 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.386948 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.389284 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.389264 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6x9x6\""
Apr 19 12:17:08.389284 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.389276 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 19 12:17:08.390304 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.390282 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 19 12:17:08.392916 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.392897 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"]
Apr 19 12:17:08.457763 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.457735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.457915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.457775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49mrm\" (UniqueName: \"kubernetes.io/projected/1974cf29-120a-4b71-9c55-8896b6d353a9-kube-api-access-49mrm\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.457915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.457813 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.558957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.558921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49mrm\" (UniqueName: \"kubernetes.io/projected/1974cf29-120a-4b71-9c55-8896b6d353a9-kube-api-access-49mrm\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.559139 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.558983 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.559139 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.559058 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.559392 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.559370 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.559459 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.559400 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.566443 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.566419 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49mrm\" (UniqueName: \"kubernetes.io/projected/1974cf29-120a-4b71-9c55-8896b6d353a9-kube-api-access-49mrm\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.697105 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.697080 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"
Apr 19 12:17:08.815018 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.814992 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb"]
Apr 19 12:17:08.816742 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:08.816713 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1974cf29_120a_4b71_9c55_8896b6d353a9.slice/crio-03fa5e83d66dd2c023b3ae0ab330ea66d681ee6a6777c203b339fa47c1248e9a WatchSource:0}: Error finding container 03fa5e83d66dd2c023b3ae0ab330ea66d681ee6a6777c203b339fa47c1248e9a: Status 404 returned error can't find the container with id 03fa5e83d66dd2c023b3ae0ab330ea66d681ee6a6777c203b339fa47c1248e9a
Apr 19 12:17:08.986857 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.986791 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"]
Apr 19 12:17:08.990495 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.990481 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:08.996650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:08.996606 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"]
Apr 19 12:17:09.062483 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.062455 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.062483 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.062483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.062686 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.062530 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn4rm\" (UniqueName: \"kubernetes.io/projected/6a1a0fbb-ddda-40c3-9d18-247790359cfa-kube-api-access-cn4rm\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.163666 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.163617 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.163819 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.163672 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.163819 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.163741 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn4rm\" (UniqueName: \"kubernetes.io/projected/6a1a0fbb-ddda-40c3-9d18-247790359cfa-kube-api-access-cn4rm\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.164085 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.164065 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.164123 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.164079 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.174033 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.174006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn4rm\" (UniqueName: \"kubernetes.io/projected/6a1a0fbb-ddda-40c3-9d18-247790359cfa-kube-api-access-cn4rm\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.301321 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.301252 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"
Apr 19 12:17:09.418693 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.418669 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf"]
Apr 19 12:17:09.419612 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:09.419588 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a1a0fbb_ddda_40c3_9d18_247790359cfa.slice/crio-152c5f8a3ef491c4fa8695634f0daab901d6dc27be01b0504462efef1d4f1ef3 WatchSource:0}: Error finding container 152c5f8a3ef491c4fa8695634f0daab901d6dc27be01b0504462efef1d4f1ef3: Status 404 returned error can't find the container with id 152c5f8a3ef491c4fa8695634f0daab901d6dc27be01b0504462efef1d4f1ef3
Apr 19 12:17:09.542952 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.542922 2568 generic.go:358] "Generic (PLEG): container finished" podID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerID="47a614527779d4f1a7b24dfbc02e55503aa9a834eccdead6750b973a5f4530ae" exitCode=0
Apr 19 12:17:09.543078 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.542969 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf" event={"ID":"6a1a0fbb-ddda-40c3-9d18-247790359cfa","Type":"ContainerDied","Data":"47a614527779d4f1a7b24dfbc02e55503aa9a834eccdead6750b973a5f4530ae"}
Apr 19 12:17:09.543078 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.542997 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf" event={"ID":"6a1a0fbb-ddda-40c3-9d18-247790359cfa","Type":"ContainerStarted","Data":"152c5f8a3ef491c4fa8695634f0daab901d6dc27be01b0504462efef1d4f1ef3"}
Apr 19 12:17:09.544528 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.544509 2568 generic.go:358] "Generic (PLEG): container finished" podID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerID="fcf12284cbc568be358ed10db91db19d65cfeca52dc78762a4efdd35c96cdc83" exitCode=0
Apr 19 12:17:09.544612 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.544569 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb" event={"ID":"1974cf29-120a-4b71-9c55-8896b6d353a9","Type":"ContainerDied","Data":"fcf12284cbc568be358ed10db91db19d65cfeca52dc78762a4efdd35c96cdc83"}
Apr 19 12:17:09.544612 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.544586 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb" event={"ID":"1974cf29-120a-4b71-9c55-8896b6d353a9","Type":"ContainerStarted","Data":"03fa5e83d66dd2c023b3ae0ab330ea66d681ee6a6777c203b339fa47c1248e9a"}
Apr 19 12:17:09.588672 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.588647 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"]
Apr 19 12:17:09.592100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.592085 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.598200 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.598176 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"]
Apr 19 12:17:09.667485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.667453 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.667618 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.667507 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxqw8\" (UniqueName: \"kubernetes.io/projected/02298eb0-c2a6-414b-ac9a-3880f185b1f9-kube-api-access-gxqw8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.667618 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.667560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.768719 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.768690 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxqw8\" (UniqueName: \"kubernetes.io/projected/02298eb0-c2a6-414b-ac9a-3880f185b1f9-kube-api-access-gxqw8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.768869 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.768740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.768869 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.768806 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.769212 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.769192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.769248 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.769208 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.775730 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.775703 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxqw8\" (UniqueName: \"kubernetes.io/projected/02298eb0-c2a6-414b-ac9a-3880f185b1f9-kube-api-access-gxqw8\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:09.901806 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:09.901734 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"
Apr 19 12:17:10.023439 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.023414 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9"]
Apr 19 12:17:10.028589 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:10.028559 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02298eb0_c2a6_414b_ac9a_3880f185b1f9.slice/crio-6e4449a0737ead3e09f5671fb6b86be78499f5b86c7c4d4c53db4c31a672f5be WatchSource:0}: Error finding container 6e4449a0737ead3e09f5671fb6b86be78499f5b86c7c4d4c53db4c31a672f5be: Status 404 returned error can't find the container with id 6e4449a0737ead3e09f5671fb6b86be78499f5b86c7c4d4c53db4c31a672f5be
Apr 19 12:17:10.185903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.185875 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"]
Apr 19 12:17:10.189308 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.189290 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.206184 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.206009 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"]
Apr 19 12:17:10.273592 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.273560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.273699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.273603 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/d1a4d9ef-7961-4947-a625-743fc70d81ca-kube-api-access-lwrx7\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.273749 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.273728 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.374227 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.374202 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.374330 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.374269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.374330 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.374303 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/d1a4d9ef-7961-4947-a625-743fc70d81ca-kube-api-access-lwrx7\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.374662 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.374615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"
Apr 19 12:17:10.374701
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.374648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" Apr 19 12:17:10.381921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.381905 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/d1a4d9ef-7961-4947-a625-743fc70d81ca-kube-api-access-lwrx7\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" Apr 19 12:17:10.506670 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.506639 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" Apr 19 12:17:10.550922 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.550883 2568 generic.go:358] "Generic (PLEG): container finished" podID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerID="7bf075579f409e6b34c7146f1a47d64487389ab9f6b1ffcb53c2e84077c70c9f" exitCode=0 Apr 19 12:17:10.551319 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.550998 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb" event={"ID":"1974cf29-120a-4b71-9c55-8896b6d353a9","Type":"ContainerDied","Data":"7bf075579f409e6b34c7146f1a47d64487389ab9f6b1ffcb53c2e84077c70c9f"} Apr 19 12:17:10.553896 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.553545 2568 generic.go:358] "Generic (PLEG): container finished" podID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerID="f99ef9120b57675aef3e4ec1473872d1ab8cf7e4fc33ff1dc844708fa8455ddd" exitCode=0 Apr 19 12:17:10.553896 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.553682 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9" event={"ID":"02298eb0-c2a6-414b-ac9a-3880f185b1f9","Type":"ContainerDied","Data":"f99ef9120b57675aef3e4ec1473872d1ab8cf7e4fc33ff1dc844708fa8455ddd"} Apr 19 12:17:10.553896 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.553709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9" event={"ID":"02298eb0-c2a6-414b-ac9a-3880f185b1f9","Type":"ContainerStarted","Data":"6e4449a0737ead3e09f5671fb6b86be78499f5b86c7c4d4c53db4c31a672f5be"} Apr 19 12:17:10.556823 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.556510 2568 generic.go:358] "Generic (PLEG): container finished" podID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" 
containerID="7df735e1baf84958d283002b30cdf0ad761ae24405e8249d6a32f53f6ec025f4" exitCode=0 Apr 19 12:17:10.556823 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.556580 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf" event={"ID":"6a1a0fbb-ddda-40c3-9d18-247790359cfa","Type":"ContainerDied","Data":"7df735e1baf84958d283002b30cdf0ad761ae24405e8249d6a32f53f6ec025f4"} Apr 19 12:17:10.635919 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:10.635892 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7"] Apr 19 12:17:10.637133 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:10.637106 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a4d9ef_7961_4947_a625_743fc70d81ca.slice/crio-85de77cdba3e64ea86163822e8a34f9c2a585e33b664f32ea2e2a757e9a90720 WatchSource:0}: Error finding container 85de77cdba3e64ea86163822e8a34f9c2a585e33b664f32ea2e2a757e9a90720: Status 404 returned error can't find the container with id 85de77cdba3e64ea86163822e8a34f9c2a585e33b664f32ea2e2a757e9a90720 Apr 19 12:17:11.562442 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.562357 2568 generic.go:358] "Generic (PLEG): container finished" podID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerID="f122174dd316d59727ab84483e707b7c8159d9a5f1fd561abdc3a3c2c2a1eab6" exitCode=0 Apr 19 12:17:11.562864 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.562441 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf" event={"ID":"6a1a0fbb-ddda-40c3-9d18-247790359cfa","Type":"ContainerDied","Data":"f122174dd316d59727ab84483e707b7c8159d9a5f1fd561abdc3a3c2c2a1eab6"} Apr 19 12:17:11.564336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.564313 2568 generic.go:358] 
"Generic (PLEG): container finished" podID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerID="c153e8b545fd48c560ad7292e650a244fdbb9409826924c4f0c827dbcea4f585" exitCode=0 Apr 19 12:17:11.564464 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.564370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb" event={"ID":"1974cf29-120a-4b71-9c55-8896b6d353a9","Type":"ContainerDied","Data":"c153e8b545fd48c560ad7292e650a244fdbb9409826924c4f0c827dbcea4f585"} Apr 19 12:17:11.565715 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.565686 2568 generic.go:358] "Generic (PLEG): container finished" podID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerID="d7efc14c86d8ef20c649c0c147be78dc1f9ec29eace89b0d4795466616a9a8f4" exitCode=0 Apr 19 12:17:11.565831 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.565767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" event={"ID":"d1a4d9ef-7961-4947-a625-743fc70d81ca","Type":"ContainerDied","Data":"d7efc14c86d8ef20c649c0c147be78dc1f9ec29eace89b0d4795466616a9a8f4"} Apr 19 12:17:11.565831 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.565794 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" event={"ID":"d1a4d9ef-7961-4947-a625-743fc70d81ca","Type":"ContainerStarted","Data":"85de77cdba3e64ea86163822e8a34f9c2a585e33b664f32ea2e2a757e9a90720"} Apr 19 12:17:11.567467 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.567447 2568 generic.go:358] "Generic (PLEG): container finished" podID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerID="425e9404a601c53ba7b6b4e53bbc9109b06a5c60727531e906f506f53df25105" exitCode=0 Apr 19 12:17:11.567689 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:11.567486 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9" event={"ID":"02298eb0-c2a6-414b-ac9a-3880f185b1f9","Type":"ContainerDied","Data":"425e9404a601c53ba7b6b4e53bbc9109b06a5c60727531e906f506f53df25105"} Apr 19 12:17:12.573086 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.573055 2568 generic.go:358] "Generic (PLEG): container finished" podID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerID="6d5c01bb5286ff404f1ed6b2e18f8e2689bd24882368101b0e5ebe45aaf5bd02" exitCode=0 Apr 19 12:17:12.573500 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.573150 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" event={"ID":"d1a4d9ef-7961-4947-a625-743fc70d81ca","Type":"ContainerDied","Data":"6d5c01bb5286ff404f1ed6b2e18f8e2689bd24882368101b0e5ebe45aaf5bd02"} Apr 19 12:17:12.575162 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.575141 2568 generic.go:358] "Generic (PLEG): container finished" podID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerID="c0096f91b5226147676cc3aa00fe8d256f3fb31966ea7d6e34adc4def5c31c7c" exitCode=0 Apr 19 12:17:12.575253 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.575228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9" event={"ID":"02298eb0-c2a6-414b-ac9a-3880f185b1f9","Type":"ContainerDied","Data":"c0096f91b5226147676cc3aa00fe8d256f3fb31966ea7d6e34adc4def5c31c7c"} Apr 19 12:17:12.717866 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.717845 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf" Apr 19 12:17:12.797560 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.797527 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn4rm\" (UniqueName: \"kubernetes.io/projected/6a1a0fbb-ddda-40c3-9d18-247790359cfa-kube-api-access-cn4rm\") pod \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " Apr 19 12:17:12.797732 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.797581 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-util\") pod \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " Apr 19 12:17:12.797732 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.797600 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-bundle\") pod \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\" (UID: \"6a1a0fbb-ddda-40c3-9d18-247790359cfa\") " Apr 19 12:17:12.798148 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.798115 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-bundle" (OuterVolumeSpecName: "bundle") pod "6a1a0fbb-ddda-40c3-9d18-247790359cfa" (UID: "6a1a0fbb-ddda-40c3-9d18-247790359cfa"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:12.799758 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.799736 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1a0fbb-ddda-40c3-9d18-247790359cfa-kube-api-access-cn4rm" (OuterVolumeSpecName: "kube-api-access-cn4rm") pod "6a1a0fbb-ddda-40c3-9d18-247790359cfa" (UID: "6a1a0fbb-ddda-40c3-9d18-247790359cfa"). InnerVolumeSpecName "kube-api-access-cn4rm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:17:12.802978 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.802955 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-util" (OuterVolumeSpecName: "util") pod "6a1a0fbb-ddda-40c3-9d18-247790359cfa" (UID: "6a1a0fbb-ddda-40c3-9d18-247790359cfa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:12.810237 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.810221 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb" Apr 19 12:17:12.898399 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.898332 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-bundle\") pod \"1974cf29-120a-4b71-9c55-8896b6d353a9\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " Apr 19 12:17:12.898514 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.898400 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-util\") pod \"1974cf29-120a-4b71-9c55-8896b6d353a9\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " Apr 19 12:17:12.898514 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.898435 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49mrm\" (UniqueName: \"kubernetes.io/projected/1974cf29-120a-4b71-9c55-8896b6d353a9-kube-api-access-49mrm\") pod \"1974cf29-120a-4b71-9c55-8896b6d353a9\" (UID: \"1974cf29-120a-4b71-9c55-8896b6d353a9\") " Apr 19 12:17:12.898678 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.898662 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cn4rm\" (UniqueName: \"kubernetes.io/projected/6a1a0fbb-ddda-40c3-9d18-247790359cfa-kube-api-access-cn4rm\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:12.898718 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.898685 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:12.898718 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.898702 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/6a1a0fbb-ddda-40c3-9d18-247790359cfa-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:12.898915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.898893 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-bundle" (OuterVolumeSpecName: "bundle") pod "1974cf29-120a-4b71-9c55-8896b6d353a9" (UID: "1974cf29-120a-4b71-9c55-8896b6d353a9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:12.900412 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.900389 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1974cf29-120a-4b71-9c55-8896b6d353a9-kube-api-access-49mrm" (OuterVolumeSpecName: "kube-api-access-49mrm") pod "1974cf29-120a-4b71-9c55-8896b6d353a9" (UID: "1974cf29-120a-4b71-9c55-8896b6d353a9"). InnerVolumeSpecName "kube-api-access-49mrm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:17:12.903253 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.903221 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-util" (OuterVolumeSpecName: "util") pod "1974cf29-120a-4b71-9c55-8896b6d353a9" (UID: "1974cf29-120a-4b71-9c55-8896b6d353a9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:12.999228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.999207 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:12.999228 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.999227 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49mrm\" (UniqueName: \"kubernetes.io/projected/1974cf29-120a-4b71-9c55-8896b6d353a9-kube-api-access-49mrm\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:12.999329 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:12.999238 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1974cf29-120a-4b71-9c55-8896b6d353a9-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:13.579996 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.579969 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf" Apr 19 12:17:13.579996 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.579984 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf" event={"ID":"6a1a0fbb-ddda-40c3-9d18-247790359cfa","Type":"ContainerDied","Data":"152c5f8a3ef491c4fa8695634f0daab901d6dc27be01b0504462efef1d4f1ef3"} Apr 19 12:17:13.580485 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.580013 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152c5f8a3ef491c4fa8695634f0daab901d6dc27be01b0504462efef1d4f1ef3" Apr 19 12:17:13.581691 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.581663 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb" event={"ID":"1974cf29-120a-4b71-9c55-8896b6d353a9","Type":"ContainerDied","Data":"03fa5e83d66dd2c023b3ae0ab330ea66d681ee6a6777c203b339fa47c1248e9a"} Apr 19 12:17:13.581797 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.581692 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb" Apr 19 12:17:13.581797 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.581694 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03fa5e83d66dd2c023b3ae0ab330ea66d681ee6a6777c203b339fa47c1248e9a" Apr 19 12:17:13.583453 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.583426 2568 generic.go:358] "Generic (PLEG): container finished" podID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerID="70592cc1e528704940d78a6b004a6227883d713019d3c85fb5ca891186541275" exitCode=0 Apr 19 12:17:13.583573 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.583506 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" event={"ID":"d1a4d9ef-7961-4947-a625-743fc70d81ca","Type":"ContainerDied","Data":"70592cc1e528704940d78a6b004a6227883d713019d3c85fb5ca891186541275"} Apr 19 12:17:13.712395 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.712374 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9" Apr 19 12:17:13.806881 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.806839 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-bundle\") pod \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " Apr 19 12:17:13.807079 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.806892 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxqw8\" (UniqueName: \"kubernetes.io/projected/02298eb0-c2a6-414b-ac9a-3880f185b1f9-kube-api-access-gxqw8\") pod \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " Apr 19 12:17:13.807079 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.806937 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-util\") pod \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\" (UID: \"02298eb0-c2a6-414b-ac9a-3880f185b1f9\") " Apr 19 12:17:13.807386 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.807360 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-bundle" (OuterVolumeSpecName: "bundle") pod "02298eb0-c2a6-414b-ac9a-3880f185b1f9" (UID: "02298eb0-c2a6-414b-ac9a-3880f185b1f9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:13.809004 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.808978 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02298eb0-c2a6-414b-ac9a-3880f185b1f9-kube-api-access-gxqw8" (OuterVolumeSpecName: "kube-api-access-gxqw8") pod "02298eb0-c2a6-414b-ac9a-3880f185b1f9" (UID: "02298eb0-c2a6-414b-ac9a-3880f185b1f9"). InnerVolumeSpecName "kube-api-access-gxqw8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:17:13.811906 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.811838 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-util" (OuterVolumeSpecName: "util") pod "02298eb0-c2a6-414b-ac9a-3880f185b1f9" (UID: "02298eb0-c2a6-414b-ac9a-3880f185b1f9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:13.907980 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.907942 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:13.907980 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.907976 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxqw8\" (UniqueName: \"kubernetes.io/projected/02298eb0-c2a6-414b-ac9a-3880f185b1f9-kube-api-access-gxqw8\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:13.907980 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:13.907987 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/02298eb0-c2a6-414b-ac9a-3880f185b1f9-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:14.589330 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.589297 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9" event={"ID":"02298eb0-c2a6-414b-ac9a-3880f185b1f9","Type":"ContainerDied","Data":"6e4449a0737ead3e09f5671fb6b86be78499f5b86c7c4d4c53db4c31a672f5be"} Apr 19 12:17:14.589330 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.589332 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e4449a0737ead3e09f5671fb6b86be78499f5b86c7c4d4c53db4c31a672f5be" Apr 19 12:17:14.589751 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.589336 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9" Apr 19 12:17:14.719322 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.719301 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" Apr 19 12:17:14.814641 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.814607 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-util\") pod \"d1a4d9ef-7961-4947-a625-743fc70d81ca\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " Apr 19 12:17:14.814794 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.814721 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/d1a4d9ef-7961-4947-a625-743fc70d81ca-kube-api-access-lwrx7\") pod \"d1a4d9ef-7961-4947-a625-743fc70d81ca\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " Apr 19 12:17:14.814794 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.814752 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-bundle\") 
pod \"d1a4d9ef-7961-4947-a625-743fc70d81ca\" (UID: \"d1a4d9ef-7961-4947-a625-743fc70d81ca\") " Apr 19 12:17:14.815433 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.815395 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-bundle" (OuterVolumeSpecName: "bundle") pod "d1a4d9ef-7961-4947-a625-743fc70d81ca" (UID: "d1a4d9ef-7961-4947-a625-743fc70d81ca"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:14.816728 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.816698 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a4d9ef-7961-4947-a625-743fc70d81ca-kube-api-access-lwrx7" (OuterVolumeSpecName: "kube-api-access-lwrx7") pod "d1a4d9ef-7961-4947-a625-743fc70d81ca" (UID: "d1a4d9ef-7961-4947-a625-743fc70d81ca"). InnerVolumeSpecName "kube-api-access-lwrx7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:17:14.822397 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.822373 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-util" (OuterVolumeSpecName: "util") pod "d1a4d9ef-7961-4947-a625-743fc70d81ca" (UID: "d1a4d9ef-7961-4947-a625-743fc70d81ca"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:14.915604 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.915527 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-util\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:14.915604 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.915552 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lwrx7\" (UniqueName: \"kubernetes.io/projected/d1a4d9ef-7961-4947-a625-743fc70d81ca-kube-api-access-lwrx7\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:14.915604 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:14.915563 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1a4d9ef-7961-4947-a625-743fc70d81ca-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:15.595681 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:15.595646 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" event={"ID":"d1a4d9ef-7961-4947-a625-743fc70d81ca","Type":"ContainerDied","Data":"85de77cdba3e64ea86163822e8a34f9c2a585e33b664f32ea2e2a757e9a90720"} Apr 19 12:17:15.595681 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:15.595683 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85de77cdba3e64ea86163822e8a34f9c2a585e33b664f32ea2e2a757e9a90720" Apr 19 12:17:15.596084 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:15.595693 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7" Apr 19 12:17:21.504213 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504178 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64db9444b9-rlpzp"] Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504506 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerName="pull" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504516 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerName="pull" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504527 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerName="util" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504532 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerName="util" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504538 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerName="extract" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504543 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerName="extract" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504556 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerName="pull" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504561 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" 
containerName="pull" Apr 19 12:17:21.504563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504567 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504572 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504582 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504587 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504593 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerName="util" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504598 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerName="util" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504607 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerName="pull" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504612 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerName="pull" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504620 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerName="util" Apr 19 12:17:21.504854 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504639 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerName="util" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504644 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504651 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504658 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerName="pull" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504663 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerName="pull" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504669 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerName="util" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504674 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerName="util" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504723 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1a4d9ef-7961-4947-a625-743fc70d81ca" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504731 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="02298eb0-c2a6-414b-ac9a-3880f185b1f9" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504739 2568 
memory_manager.go:356] "RemoveStaleState removing state" podUID="1974cf29-120a-4b71-9c55-8896b6d353a9" containerName="extract" Apr 19 12:17:21.504854 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.504746 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a1a0fbb-ddda-40c3-9d18-247790359cfa" containerName="extract" Apr 19 12:17:21.509037 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.509020 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.525065 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.525037 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64db9444b9-rlpzp"] Apr 19 12:17:21.675001 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.674968 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-service-ca\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.675001 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.675007 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21388144-4fec-41b8-bea8-923e7a8c17ab-console-oauth-config\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.675256 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.675032 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/21388144-4fec-41b8-bea8-923e7a8c17ab-console-serving-cert\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " 
pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.675256 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.675049 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-oauth-serving-cert\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.675256 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.675106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwsrr\" (UniqueName: \"kubernetes.io/projected/21388144-4fec-41b8-bea8-923e7a8c17ab-kube-api-access-fwsrr\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.675256 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.675208 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-console-config\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.675256 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.675255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-trusted-ca-bundle\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.775983 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.775907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/21388144-4fec-41b8-bea8-923e7a8c17ab-console-serving-cert\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.775983 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.775938 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-oauth-serving-cert\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.775983 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.775956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwsrr\" (UniqueName: \"kubernetes.io/projected/21388144-4fec-41b8-bea8-923e7a8c17ab-kube-api-access-fwsrr\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.776243 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.776000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-console-config\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.776243 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.776034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-trusted-ca-bundle\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.776243 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.776063 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-service-ca\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.776243 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.776089 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21388144-4fec-41b8-bea8-923e7a8c17ab-console-oauth-config\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.776849 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.776826 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-oauth-serving-cert\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.776955 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.776827 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-console-config\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.777072 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.777046 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-service-ca\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.777112 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.777055 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21388144-4fec-41b8-bea8-923e7a8c17ab-trusted-ca-bundle\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.778693 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.778667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/21388144-4fec-41b8-bea8-923e7a8c17ab-console-oauth-config\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.778770 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.778667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/21388144-4fec-41b8-bea8-923e7a8c17ab-console-serving-cert\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.783764 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.783745 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwsrr\" (UniqueName: \"kubernetes.io/projected/21388144-4fec-41b8-bea8-923e7a8c17ab-kube-api-access-fwsrr\") pod \"console-64db9444b9-rlpzp\" (UID: \"21388144-4fec-41b8-bea8-923e7a8c17ab\") " pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.819003 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.818967 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:21.951339 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:21.951311 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64db9444b9-rlpzp"] Apr 19 12:17:21.952446 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:21.952411 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21388144_4fec_41b8_bea8_923e7a8c17ab.slice/crio-3eef46fb71e039898d94056f5494dbf92a5b14b668bb7cdd3b3c2b9787dfbcf8 WatchSource:0}: Error finding container 3eef46fb71e039898d94056f5494dbf92a5b14b668bb7cdd3b3c2b9787dfbcf8: Status 404 returned error can't find the container with id 3eef46fb71e039898d94056f5494dbf92a5b14b668bb7cdd3b3c2b9787dfbcf8 Apr 19 12:17:22.626770 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:22.626738 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64db9444b9-rlpzp" event={"ID":"21388144-4fec-41b8-bea8-923e7a8c17ab","Type":"ContainerStarted","Data":"65c5314ad5585b62244937d199dd7d6749f23a651966f29904059d74977603b1"} Apr 19 12:17:22.626770 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:22.626772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64db9444b9-rlpzp" event={"ID":"21388144-4fec-41b8-bea8-923e7a8c17ab","Type":"ContainerStarted","Data":"3eef46fb71e039898d94056f5494dbf92a5b14b668bb7cdd3b3c2b9787dfbcf8"} Apr 19 12:17:22.644160 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:22.644116 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64db9444b9-rlpzp" podStartSLOduration=1.6441033269999998 podStartE2EDuration="1.644103327s" podCreationTimestamp="2026-04-19 12:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:17:22.641120871 +0000 UTC 
m=+461.359945037" watchObservedRunningTime="2026-04-19 12:17:22.644103327 +0000 UTC m=+461.362927482" Apr 19 12:17:31.820010 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:31.819947 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:31.820473 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:31.820038 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:31.824860 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:31.824841 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:32.667678 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:32.667654 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64db9444b9-rlpzp" Apr 19 12:17:32.719231 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:32.719192 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-779499566f-wd8rw"] Apr 19 12:17:37.166699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.166664 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf"] Apr 19 12:17:37.170275 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.170255 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.172893 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.172869 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 19 12:17:37.173015 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.172895 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-g6tzz\"" Apr 19 12:17:37.173015 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.172896 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 19 12:17:37.181244 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.181223 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf"] Apr 19 12:17:37.200820 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.200791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.200940 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.200866 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6j4v\" (UniqueName: \"kubernetes.io/projected/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-kube-api-access-d6j4v\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.301760 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.301719 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.301929 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.301811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6j4v\" (UniqueName: \"kubernetes.io/projected/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-kube-api-access-d6j4v\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.302152 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.302128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.315560 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.310165 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6j4v\" (UniqueName: \"kubernetes.io/projected/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-kube-api-access-d6j4v\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.481609 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.481581 2568 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:37.612245 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.612212 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf"] Apr 19 12:17:37.613124 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:37.613099 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod017bd88f_7de4_4ce7_bbce_dfbc6ce6e559.slice/crio-722e1a6746d89c6e3f075e8cceeeeb6e8b00373e556337b968d11f70e0083d28 WatchSource:0}: Error finding container 722e1a6746d89c6e3f075e8cceeeeb6e8b00373e556337b968d11f70e0083d28: Status 404 returned error can't find the container with id 722e1a6746d89c6e3f075e8cceeeeb6e8b00373e556337b968d11f70e0083d28 Apr 19 12:17:37.685010 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:37.684976 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" event={"ID":"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559","Type":"ContainerStarted","Data":"722e1a6746d89c6e3f075e8cceeeeb6e8b00373e556337b968d11f70e0083d28"} Apr 19 12:17:43.712304 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:43.712260 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" event={"ID":"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559","Type":"ContainerStarted","Data":"a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14"} Apr 19 12:17:43.712772 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:43.712371 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:43.735950 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:43.735895 2568 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" podStartSLOduration=1.319014799 podStartE2EDuration="6.7358789s" podCreationTimestamp="2026-04-19 12:17:37 +0000 UTC" firstStartedPulling="2026-04-19 12:17:37.615966259 +0000 UTC m=+476.334790395" lastFinishedPulling="2026-04-19 12:17:43.032830351 +0000 UTC m=+481.751654496" observedRunningTime="2026-04-19 12:17:43.735487318 +0000 UTC m=+482.454311474" watchObservedRunningTime="2026-04-19 12:17:43.7358789 +0000 UTC m=+482.454703056" Apr 19 12:17:54.718070 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:54.718034 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:55.629222 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.629183 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28"] Apr 19 12:17:55.635771 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.635744 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:55.644557 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.644530 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28"] Apr 19 12:17:55.764456 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.764426 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bf54a7a8-b96d-4b07-aeed-3df69529ae98-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:55.764851 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.764473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd85x\" (UniqueName: \"kubernetes.io/projected/bf54a7a8-b96d-4b07-aeed-3df69529ae98-kube-api-access-sd85x\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:55.865578 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.865548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd85x\" (UniqueName: \"kubernetes.io/projected/bf54a7a8-b96d-4b07-aeed-3df69529ae98-kube-api-access-sd85x\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:55.865755 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.865705 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/bf54a7a8-b96d-4b07-aeed-3df69529ae98-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:55.866103 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.866079 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bf54a7a8-b96d-4b07-aeed-3df69529ae98-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:55.875980 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.875960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd85x\" (UniqueName: \"kubernetes.io/projected/bf54a7a8-b96d-4b07-aeed-3df69529ae98-kube-api-access-sd85x\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:55.947311 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:55.947289 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:56.279563 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.279537 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28"] Apr 19 12:17:56.281172 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:56.281102 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf54a7a8_b96d_4b07_aeed_3df69529ae98.slice/crio-0cd13572090912536cf1776ab07e30f3e94c0dd553aba73ea64fb4df22c85acc WatchSource:0}: Error finding container 0cd13572090912536cf1776ab07e30f3e94c0dd553aba73ea64fb4df22c85acc: Status 404 returned error can't find the container with id 0cd13572090912536cf1776ab07e30f3e94c0dd553aba73ea64fb4df22c85acc Apr 19 12:17:56.284076 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.284052 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf"] Apr 19 12:17:56.284304 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.284254 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" containerName="manager" containerID="cri-o://a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14" gracePeriod=2 Apr 19 12:17:56.290832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.290807 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf"] Apr 19 12:17:56.303013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.302921 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28"] Apr 19 12:17:56.309709 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:17:56.309685 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg"] Apr 19 12:17:56.310235 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.310217 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" containerName="manager" Apr 19 12:17:56.310307 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.310239 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" containerName="manager" Apr 19 12:17:56.310359 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.310348 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" containerName="manager" Apr 19 12:17:56.314108 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.314090 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28"] Apr 19 12:17:56.314248 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.314222 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.316382 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.316356 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.320845 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.320825 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg"] Apr 19 12:17:56.332320 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.332298 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m"] Apr 19 12:17:56.336272 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.336253 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.346995 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.346971 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m"] Apr 19 12:17:56.350246 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.350195 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.370957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.370900 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/882f21fe-f287-438b-89d5-694148e82a25-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gv5dg\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.371834 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.370997 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq785\" (UniqueName: \"kubernetes.io/projected/882f21fe-f287-438b-89d5-694148e82a25-kube-api-access-xq785\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gv5dg\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.472205 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.472182 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx65b\" (UniqueName: \"kubernetes.io/projected/3572a677-8591-4227-9cc8-92cd786257cc-kube-api-access-kx65b\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-79d5m\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.472327 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.472226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq785\" (UniqueName: \"kubernetes.io/projected/882f21fe-f287-438b-89d5-694148e82a25-kube-api-access-xq785\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gv5dg\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.472393 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.472378 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/882f21fe-f287-438b-89d5-694148e82a25-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gv5dg\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.472453 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.472423 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3572a677-8591-4227-9cc8-92cd786257cc-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-79d5m\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.472727 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.472708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/882f21fe-f287-438b-89d5-694148e82a25-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gv5dg\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.479512 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.479492 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq785\" (UniqueName: \"kubernetes.io/projected/882f21fe-f287-438b-89d5-694148e82a25-kube-api-access-xq785\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gv5dg\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.513692 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.513675 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:56.515616 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.515595 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.573857 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.573797 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-extensions-socket-volume\") pod \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " Apr 
19 12:17:56.573857 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.573830 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6j4v\" (UniqueName: \"kubernetes.io/projected/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-kube-api-access-d6j4v\") pod \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\" (UID: \"017bd88f-7de4-4ce7-bbce-dfbc6ce6e559\") " Apr 19 12:17:56.574019 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.574000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3572a677-8591-4227-9cc8-92cd786257cc-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-79d5m\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.574076 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.574059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx65b\" (UniqueName: \"kubernetes.io/projected/3572a677-8591-4227-9cc8-92cd786257cc-kube-api-access-kx65b\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-79d5m\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.574275 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.574250 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" (UID: "017bd88f-7de4-4ce7-bbce-dfbc6ce6e559"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:56.574369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.574348 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3572a677-8591-4227-9cc8-92cd786257cc-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-79d5m\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.575886 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.575866 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-kube-api-access-d6j4v" (OuterVolumeSpecName: "kube-api-access-d6j4v") pod "017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" (UID: "017bd88f-7de4-4ce7-bbce-dfbc6ce6e559"). InnerVolumeSpecName "kube-api-access-d6j4v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:17:56.582796 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.582777 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx65b\" (UniqueName: \"kubernetes.io/projected/3572a677-8591-4227-9cc8-92cd786257cc-kube-api-access-kx65b\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-79d5m\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.661895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.661847 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:56.669672 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.669651 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:56.674529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.674491 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-extensions-socket-volume\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:56.674529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.674525 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d6j4v\" (UniqueName: \"kubernetes.io/projected/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559-kube-api-access-d6j4v\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:56.766376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.766346 2568 generic.go:358] "Generic (PLEG): container finished" podID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" containerID="a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14" exitCode=0 Apr 19 12:17:56.766806 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.766384 2568 scope.go:117] "RemoveContainer" containerID="a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14" Apr 19 12:17:56.766806 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.766392 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" Apr 19 12:17:56.768951 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.768569 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.769390 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.769206 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" containerName="manager" containerID="cri-o://b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a" gracePeriod=2 Apr 19 12:17:56.771298 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.771270 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.773853 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.773810 2568 status_manager.go:895] "Failed to get status for pod" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" is forbidden: User 
\"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.780906 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.780811 2568 scope.go:117] "RemoveContainer" containerID="a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14" Apr 19 12:17:56.781142 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:17:56.781116 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14\": container with ID starting with a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14 not found: ID does not exist" containerID="a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14" Apr 19 12:17:56.781231 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.781154 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14"} err="failed to get container status \"a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14\": rpc error: code = NotFound desc = could not find container \"a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14\": container with ID starting with a586f82eade4d243bac5389699dbf2e8aa2a578eb1fc26e1cd9136444fa1cb14 not found: ID does not exist" Apr 19 12:17:56.801909 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.801844 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg"] Apr 19 12:17:56.811607 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.811578 2568 status_manager.go:895] "Failed to get status for pod" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" err="pods 
\"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.813296 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.813278 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:56.823103 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:56.822907 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod882f21fe_f287_438b_89d5_694148e82a25.slice/crio-5bec3af4b6cca74428933d68eecd7479fd30ad0d1da04425552a89483adcf0ba WatchSource:0}: Error finding container 5bec3af4b6cca74428933d68eecd7479fd30ad0d1da04425552a89483adcf0ba: Status 404 returned error can't find the container with id 5bec3af4b6cca74428933d68eecd7479fd30ad0d1da04425552a89483adcf0ba Apr 19 12:17:56.827599 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:56.827577 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m"] Apr 19 12:17:56.827727 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:17:56.827708 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3572a677_8591_4227_9cc8_92cd786257cc.slice/crio-f054af19cfc3846406ed11cc3a93636a4b66e3aa88e33cfefe6ae9ef5220ee66 WatchSource:0}: Error finding container 
f054af19cfc3846406ed11cc3a93636a4b66e3aa88e33cfefe6ae9ef5220ee66: Status 404 returned error can't find the container with id f054af19cfc3846406ed11cc3a93636a4b66e3aa88e33cfefe6ae9ef5220ee66 Apr 19 12:17:57.010893 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.010871 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:57.013891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.013869 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.015657 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.015636 2568 status_manager.go:895] "Failed to get status for pod" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.077874 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.077810 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd85x\" (UniqueName: \"kubernetes.io/projected/bf54a7a8-b96d-4b07-aeed-3df69529ae98-kube-api-access-sd85x\") pod \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " Apr 19 12:17:57.077991 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:17:57.077894 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bf54a7a8-b96d-4b07-aeed-3df69529ae98-extensions-socket-volume\") pod \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\" (UID: \"bf54a7a8-b96d-4b07-aeed-3df69529ae98\") " Apr 19 12:17:57.078140 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.078114 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf54a7a8-b96d-4b07-aeed-3df69529ae98-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "bf54a7a8-b96d-4b07-aeed-3df69529ae98" (UID: "bf54a7a8-b96d-4b07-aeed-3df69529ae98"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:17:57.079897 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.079874 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf54a7a8-b96d-4b07-aeed-3df69529ae98-kube-api-access-sd85x" (OuterVolumeSpecName: "kube-api-access-sd85x") pod "bf54a7a8-b96d-4b07-aeed-3df69529ae98" (UID: "bf54a7a8-b96d-4b07-aeed-3df69529ae98"). InnerVolumeSpecName "kube-api-access-sd85x". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:17:57.178868 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.178845 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sd85x\" (UniqueName: \"kubernetes.io/projected/bf54a7a8-b96d-4b07-aeed-3df69529ae98-kube-api-access-sd85x\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:57.178868 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.178867 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bf54a7a8-b96d-4b07-aeed-3df69529ae98-extensions-socket-volume\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:57.747641 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.747581 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-779499566f-wd8rw" podUID="c4505d4d-1a94-46df-9a91-58f9beb22fe2" containerName="console" containerID="cri-o://b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e" gracePeriod=15 Apr 19 12:17:57.775279 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.775250 2568 generic.go:358] "Generic (PLEG): container finished" podID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" containerID="b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a" exitCode=2 Apr 19 12:17:57.775619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.775298 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" Apr 19 12:17:57.775619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.775330 2568 scope.go:117] "RemoveContainer" containerID="b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a" Apr 19 12:17:57.777036 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.777011 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" event={"ID":"882f21fe-f287-438b-89d5-694148e82a25","Type":"ContainerStarted","Data":"47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002"} Apr 19 12:17:57.777146 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.777046 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" event={"ID":"882f21fe-f287-438b-89d5-694148e82a25","Type":"ContainerStarted","Data":"5bec3af4b6cca74428933d68eecd7479fd30ad0d1da04425552a89483adcf0ba"} Apr 19 12:17:57.777146 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.777103 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:17:57.780307 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.777703 2568 status_manager.go:895] "Failed to get status for pod" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.780531 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.780505 2568 status_manager.go:895] "Failed to get status for pod" 
podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.781957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.781937 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" event={"ID":"3572a677-8591-4227-9cc8-92cd786257cc","Type":"ContainerStarted","Data":"15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61"} Apr 19 12:17:57.782062 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.781961 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" event={"ID":"3572a677-8591-4227-9cc8-92cd786257cc","Type":"ContainerStarted","Data":"f054af19cfc3846406ed11cc3a93636a4b66e3aa88e33cfefe6ae9ef5220ee66"} Apr 19 12:17:57.782062 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.782041 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:17:57.805338 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.805305 2568 patch_prober.go:28] interesting pod/console-779499566f-wd8rw container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.132.0.22:8443/health\": dial tcp 10.132.0.22:8443: connect: connection refused" start-of-body= Apr 19 12:17:57.805444 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.805372 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-779499566f-wd8rw" podUID="c4505d4d-1a94-46df-9a91-58f9beb22fe2" containerName="console" 
probeResult="failure" output="Get \"https://10.132.0.22:8443/health\": dial tcp 10.132.0.22:8443: connect: connection refused" Apr 19 12:17:57.806657 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.806563 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" podStartSLOduration=1.8065540850000001 podStartE2EDuration="1.806554085s" podCreationTimestamp="2026-04-19 12:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:17:57.805431478 +0000 UTC m=+496.524255635" watchObservedRunningTime="2026-04-19 12:17:57.806554085 +0000 UTC m=+496.525378239" Apr 19 12:17:57.807191 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.807163 2568 status_manager.go:895] "Failed to get status for pod" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.808700 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.808673 2568 status_manager.go:895] "Failed to get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.810443 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.810419 2568 status_manager.go:895] "Failed to 
get status for pod" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-8vvtf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-8vvtf\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.829438 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.829408 2568 status_manager.go:895] "Failed to get status for pod" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-f9b28" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-f9b28\" is forbidden: User \"system:node:ip-10-0-140-225.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-225.ec2.internal' and this object" Apr 19 12:17:57.830043 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.830000 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" podStartSLOduration=1.829983635 podStartE2EDuration="1.829983635s" podCreationTimestamp="2026-04-19 12:17:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:17:57.827489134 +0000 UTC m=+496.546313303" watchObservedRunningTime="2026-04-19 12:17:57.829983635 +0000 UTC m=+496.548807793" Apr 19 12:17:57.859086 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.859062 2568 scope.go:117] "RemoveContainer" containerID="b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a" Apr 19 12:17:57.859364 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:17:57.859344 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a\": container with ID starting with b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a not found: ID does not exist" containerID="b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a" Apr 19 12:17:57.859448 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.859375 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a"} err="failed to get container status \"b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a\": rpc error: code = NotFound desc = could not find container \"b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a\": container with ID starting with b87384ab883bdd76578c90be535f51870ca04057f4565c8f19b62103b9c6801a not found: ID does not exist" Apr 19 12:17:57.868619 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.868598 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="017bd88f-7de4-4ce7-bbce-dfbc6ce6e559" path="/var/lib/kubelet/pods/017bd88f-7de4-4ce7-bbce-dfbc6ce6e559/volumes" Apr 19 12:17:57.868994 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.868979 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" path="/var/lib/kubelet/pods/bf54a7a8-b96d-4b07-aeed-3df69529ae98/volumes" Apr 19 12:17:57.994458 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.994434 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-779499566f-wd8rw_c4505d4d-1a94-46df-9a91-58f9beb22fe2/console/0.log" Apr 19 12:17:57.994578 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:57.994496 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:17:58.086793 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.086722 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-config\") pod \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " Apr 19 12:17:58.086793 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.086762 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-service-ca\") pod \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " Apr 19 12:17:58.086969 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.086804 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-oauth-serving-cert\") pod \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " Apr 19 12:17:58.086969 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.086831 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rz2v\" (UniqueName: \"kubernetes.io/projected/c4505d4d-1a94-46df-9a91-58f9beb22fe2-kube-api-access-9rz2v\") pod \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " Apr 19 12:17:58.086969 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.086860 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-oauth-config\") pod \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " Apr 19 12:17:58.086969 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.086909 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-serving-cert\") pod \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " Apr 19 12:17:58.086969 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.086959 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-trusted-ca-bundle\") pod \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\" (UID: \"c4505d4d-1a94-46df-9a91-58f9beb22fe2\") " Apr 19 12:17:58.087197 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.087130 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-config" (OuterVolumeSpecName: "console-config") pod "c4505d4d-1a94-46df-9a91-58f9beb22fe2" (UID: "c4505d4d-1a94-46df-9a91-58f9beb22fe2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:17:58.087257 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.087225 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-service-ca" (OuterVolumeSpecName: "service-ca") pod "c4505d4d-1a94-46df-9a91-58f9beb22fe2" (UID: "c4505d4d-1a94-46df-9a91-58f9beb22fe2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:17:58.087257 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.087239 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c4505d4d-1a94-46df-9a91-58f9beb22fe2" (UID: "c4505d4d-1a94-46df-9a91-58f9beb22fe2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:17:58.087573 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.087544 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c4505d4d-1a94-46df-9a91-58f9beb22fe2" (UID: "c4505d4d-1a94-46df-9a91-58f9beb22fe2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:17:58.089010 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.088981 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c4505d4d-1a94-46df-9a91-58f9beb22fe2" (UID: "c4505d4d-1a94-46df-9a91-58f9beb22fe2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:17:58.089105 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.089067 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4505d4d-1a94-46df-9a91-58f9beb22fe2-kube-api-access-9rz2v" (OuterVolumeSpecName: "kube-api-access-9rz2v") pod "c4505d4d-1a94-46df-9a91-58f9beb22fe2" (UID: "c4505d4d-1a94-46df-9a91-58f9beb22fe2"). InnerVolumeSpecName "kube-api-access-9rz2v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:17:58.089190 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.089174 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c4505d4d-1a94-46df-9a91-58f9beb22fe2" (UID: "c4505d4d-1a94-46df-9a91-58f9beb22fe2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:17:58.188188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.188164 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-serving-cert\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:58.188188 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.188188 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-trusted-ca-bundle\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:58.188343 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.188197 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-config\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:58.188343 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.188207 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-service-ca\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:58.188343 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.188215 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c4505d4d-1a94-46df-9a91-58f9beb22fe2-oauth-serving-cert\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:58.188343 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.188226 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9rz2v\" (UniqueName: \"kubernetes.io/projected/c4505d4d-1a94-46df-9a91-58f9beb22fe2-kube-api-access-9rz2v\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:58.188343 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.188238 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c4505d4d-1a94-46df-9a91-58f9beb22fe2-console-oauth-config\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:17:58.786992 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.786964 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-779499566f-wd8rw_c4505d4d-1a94-46df-9a91-58f9beb22fe2/console/0.log" Apr 19 12:17:58.787373 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.787007 2568 generic.go:358] "Generic (PLEG): container finished" podID="c4505d4d-1a94-46df-9a91-58f9beb22fe2" containerID="b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e" exitCode=2 Apr 19 12:17:58.787373 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.787105 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-779499566f-wd8rw" Apr 19 12:17:58.787373 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.787097 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779499566f-wd8rw" event={"ID":"c4505d4d-1a94-46df-9a91-58f9beb22fe2","Type":"ContainerDied","Data":"b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e"} Apr 19 12:17:58.787373 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.787238 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-779499566f-wd8rw" event={"ID":"c4505d4d-1a94-46df-9a91-58f9beb22fe2","Type":"ContainerDied","Data":"bc345aaed6dffd98f63466e12642566a1f9c3e446405bc7073b21e615f90575f"} Apr 19 12:17:58.787373 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.787281 2568 scope.go:117] "RemoveContainer" containerID="b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e" Apr 19 12:17:58.796521 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.796501 2568 scope.go:117] "RemoveContainer" containerID="b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e" Apr 19 12:17:58.796825 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:17:58.796798 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e\": container with ID starting with b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e not found: ID does not exist" containerID="b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e" Apr 19 12:17:58.796931 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.796829 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e"} err="failed to get container status \"b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e\": rpc error: code = 
NotFound desc = could not find container \"b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e\": container with ID starting with b3c93103c92a0e0ed2f7a15e7511a9d1a76791d9eb876c9ab2e6310f391d607e not found: ID does not exist" Apr 19 12:17:58.821138 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.821117 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-779499566f-wd8rw"] Apr 19 12:17:58.824649 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:58.824612 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-779499566f-wd8rw"] Apr 19 12:17:59.869617 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:17:59.869586 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4505d4d-1a94-46df-9a91-58f9beb22fe2" path="/var/lib/kubelet/pods/c4505d4d-1a94-46df-9a91-58f9beb22fe2/volumes" Apr 19 12:18:08.790445 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:08.790411 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:18:08.790824 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:08.790465 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:18:08.859794 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:08.859762 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg"] Apr 19 12:18:08.860075 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:08.859946 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" podUID="882f21fe-f287-438b-89d5-694148e82a25" containerName="manager" containerID="cri-o://47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002" gracePeriod=10 Apr 19 12:18:09.081547 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.081517 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k"] Apr 19 12:18:09.081908 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.081895 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" containerName="manager" Apr 19 12:18:09.081956 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.081910 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" containerName="manager" Apr 19 12:18:09.081956 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.081924 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4505d4d-1a94-46df-9a91-58f9beb22fe2" containerName="console" Apr 19 12:18:09.081956 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.081929 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4505d4d-1a94-46df-9a91-58f9beb22fe2" containerName="console" Apr 19 12:18:09.082071 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.081997 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf54a7a8-b96d-4b07-aeed-3df69529ae98" containerName="manager" Apr 19 12:18:09.082071 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.082005 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4505d4d-1a94-46df-9a91-58f9beb22fe2" containerName="console" Apr 19 12:18:09.085087 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.085062 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.094755 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.094731 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k"] Apr 19 12:18:09.109281 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.109254 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:18:09.181944 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.181915 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq785\" (UniqueName: \"kubernetes.io/projected/882f21fe-f287-438b-89d5-694148e82a25-kube-api-access-xq785\") pod \"882f21fe-f287-438b-89d5-694148e82a25\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " Apr 19 12:18:09.182100 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.181996 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/882f21fe-f287-438b-89d5-694148e82a25-extensions-socket-volume\") pod \"882f21fe-f287-438b-89d5-694148e82a25\" (UID: \"882f21fe-f287-438b-89d5-694148e82a25\") " Apr 19 12:18:09.182212 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.182194 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klskv\" (UniqueName: \"kubernetes.io/projected/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-kube-api-access-klskv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zh97k\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.182311 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.182296 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zh97k\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.182387 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.182363 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/882f21fe-f287-438b-89d5-694148e82a25-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "882f21fe-f287-438b-89d5-694148e82a25" (UID: "882f21fe-f287-438b-89d5-694148e82a25"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:18:09.183849 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.183829 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882f21fe-f287-438b-89d5-694148e82a25-kube-api-access-xq785" (OuterVolumeSpecName: "kube-api-access-xq785") pod "882f21fe-f287-438b-89d5-694148e82a25" (UID: "882f21fe-f287-438b-89d5-694148e82a25"). InnerVolumeSpecName "kube-api-access-xq785". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:18:09.283151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.283120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klskv\" (UniqueName: \"kubernetes.io/projected/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-kube-api-access-klskv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zh97k\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.283306 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.283176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zh97k\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.283306 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.283235 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xq785\" (UniqueName: \"kubernetes.io/projected/882f21fe-f287-438b-89d5-694148e82a25-kube-api-access-xq785\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:18:09.283306 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.283246 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/882f21fe-f287-438b-89d5-694148e82a25-extensions-socket-volume\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:18:09.283569 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.283553 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-55c7f4c975-zh97k\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.291242 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.291209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klskv\" (UniqueName: \"kubernetes.io/projected/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-kube-api-access-klskv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zh97k\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.406991 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.406905 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.531917 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.531892 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k"] Apr 19 12:18:09.533334 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:18:09.533308 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41ae5e67_fd4a_40b2_b192_c1021c3b37bb.slice/crio-eeaa951fc6d1165fc2886b6148b07d71743f1c5445cf88fa213ba7f67ef8e6d9 WatchSource:0}: Error finding container eeaa951fc6d1165fc2886b6148b07d71743f1c5445cf88fa213ba7f67ef8e6d9: Status 404 returned error can't find the container with id eeaa951fc6d1165fc2886b6148b07d71743f1c5445cf88fa213ba7f67ef8e6d9 Apr 19 12:18:09.833890 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.833860 2568 generic.go:358] "Generic (PLEG): container finished" podID="882f21fe-f287-438b-89d5-694148e82a25" containerID="47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002" exitCode=0 Apr 19 12:18:09.834321 ip-10-0-140-225 kubenswrapper[2568]: 
I0419 12:18:09.833918 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" Apr 19 12:18:09.834321 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.833940 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" event={"ID":"882f21fe-f287-438b-89d5-694148e82a25","Type":"ContainerDied","Data":"47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002"} Apr 19 12:18:09.834321 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.833983 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg" event={"ID":"882f21fe-f287-438b-89d5-694148e82a25","Type":"ContainerDied","Data":"5bec3af4b6cca74428933d68eecd7479fd30ad0d1da04425552a89483adcf0ba"} Apr 19 12:18:09.834321 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.834000 2568 scope.go:117] "RemoveContainer" containerID="47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002" Apr 19 12:18:09.835681 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.835659 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" event={"ID":"41ae5e67-fd4a-40b2-b192-c1021c3b37bb","Type":"ContainerStarted","Data":"7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7"} Apr 19 12:18:09.835779 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.835687 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" event={"ID":"41ae5e67-fd4a-40b2-b192-c1021c3b37bb","Type":"ContainerStarted","Data":"eeaa951fc6d1165fc2886b6148b07d71743f1c5445cf88fa213ba7f67ef8e6d9"} Apr 19 12:18:09.835941 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.835780 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:09.843006 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.842990 2568 scope.go:117] "RemoveContainer" containerID="47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002" Apr 19 12:18:09.843246 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:18:09.843228 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002\": container with ID starting with 47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002 not found: ID does not exist" containerID="47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002" Apr 19 12:18:09.843297 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.843253 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002"} err="failed to get container status \"47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002\": rpc error: code = NotFound desc = could not find container \"47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002\": container with ID starting with 47ae62f84294fbe8bf75bdd6f6a4675861f67cb977977a9bcd3b96f5e7531002 not found: ID does not exist" Apr 19 12:18:09.860282 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.860240 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" podStartSLOduration=0.86022854 podStartE2EDuration="860.22854ms" podCreationTimestamp="2026-04-19 12:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:18:09.8583359 +0000 UTC m=+508.577160070" watchObservedRunningTime="2026-04-19 12:18:09.86022854 +0000 UTC m=+508.579052740" Apr 19 
12:18:09.880166 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.880146 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg"] Apr 19 12:18:09.886089 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:09.886066 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gv5dg"] Apr 19 12:18:11.869474 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:11.869443 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="882f21fe-f287-438b-89d5-694148e82a25" path="/var/lib/kubelet/pods/882f21fe-f287-438b-89d5-694148e82a25/volumes" Apr 19 12:18:20.843089 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:20.843062 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:18:20.897669 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:20.897620 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m"] Apr 19 12:18:20.897964 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:20.897938 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" podUID="3572a677-8591-4227-9cc8-92cd786257cc" containerName="manager" containerID="cri-o://15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61" gracePeriod=10 Apr 19 12:18:21.140336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.140315 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:18:21.292869 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.292826 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3572a677-8591-4227-9cc8-92cd786257cc-extensions-socket-volume\") pod \"3572a677-8591-4227-9cc8-92cd786257cc\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " Apr 19 12:18:21.293068 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.292906 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx65b\" (UniqueName: \"kubernetes.io/projected/3572a677-8591-4227-9cc8-92cd786257cc-kube-api-access-kx65b\") pod \"3572a677-8591-4227-9cc8-92cd786257cc\" (UID: \"3572a677-8591-4227-9cc8-92cd786257cc\") " Apr 19 12:18:21.293267 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.293241 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3572a677-8591-4227-9cc8-92cd786257cc-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "3572a677-8591-4227-9cc8-92cd786257cc" (UID: "3572a677-8591-4227-9cc8-92cd786257cc"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:18:21.294958 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.294937 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3572a677-8591-4227-9cc8-92cd786257cc-kube-api-access-kx65b" (OuterVolumeSpecName: "kube-api-access-kx65b") pod "3572a677-8591-4227-9cc8-92cd786257cc" (UID: "3572a677-8591-4227-9cc8-92cd786257cc"). InnerVolumeSpecName "kube-api-access-kx65b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:18:21.393818 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.393751 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kx65b\" (UniqueName: \"kubernetes.io/projected/3572a677-8591-4227-9cc8-92cd786257cc-kube-api-access-kx65b\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:18:21.393818 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.393779 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/3572a677-8591-4227-9cc8-92cd786257cc-extensions-socket-volume\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:18:21.894636 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.894597 2568 generic.go:358] "Generic (PLEG): container finished" podID="3572a677-8591-4227-9cc8-92cd786257cc" containerID="15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61" exitCode=0 Apr 19 12:18:21.895030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.894669 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" event={"ID":"3572a677-8591-4227-9cc8-92cd786257cc","Type":"ContainerDied","Data":"15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61"} Apr 19 12:18:21.895030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.894685 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" Apr 19 12:18:21.895030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.894709 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m" event={"ID":"3572a677-8591-4227-9cc8-92cd786257cc","Type":"ContainerDied","Data":"f054af19cfc3846406ed11cc3a93636a4b66e3aa88e33cfefe6ae9ef5220ee66"} Apr 19 12:18:21.895030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.894727 2568 scope.go:117] "RemoveContainer" containerID="15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61" Apr 19 12:18:21.905164 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.904995 2568 scope.go:117] "RemoveContainer" containerID="15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61" Apr 19 12:18:21.905260 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:18:21.905241 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61\": container with ID starting with 15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61 not found: ID does not exist" containerID="15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61" Apr 19 12:18:21.905310 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.905274 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61"} err="failed to get container status \"15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61\": rpc error: code = NotFound desc = could not find container \"15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61\": container with ID starting with 15c82b3bb75ff614636992661003c33c00f13938402290d06d76d01fce197a61 not found: ID does not exist" Apr 19 12:18:21.914138 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:18:21.914117 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m"] Apr 19 12:18:21.917883 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:21.917864 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-79d5m"] Apr 19 12:18:23.869421 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:23.869381 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3572a677-8591-4227-9cc8-92cd786257cc" path="/var/lib/kubelet/pods/3572a677-8591-4227-9cc8-92cd786257cc/volumes" Apr 19 12:18:38.961330 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.961296 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-t29tf"] Apr 19 12:18:38.961770 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.961685 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="882f21fe-f287-438b-89d5-694148e82a25" containerName="manager" Apr 19 12:18:38.961770 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.961698 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="882f21fe-f287-438b-89d5-694148e82a25" containerName="manager" Apr 19 12:18:38.961770 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.961706 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3572a677-8591-4227-9cc8-92cd786257cc" containerName="manager" Apr 19 12:18:38.961770 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.961712 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3572a677-8591-4227-9cc8-92cd786257cc" containerName="manager" Apr 19 12:18:38.961904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.961785 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3572a677-8591-4227-9cc8-92cd786257cc" containerName="manager" Apr 19 12:18:38.961904 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:18:38.961793 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="882f21fe-f287-438b-89d5-694148e82a25" containerName="manager" Apr 19 12:18:38.964829 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.964814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:38.967293 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.967268 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 19 12:18:38.968025 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.968002 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-6x9x6\"" Apr 19 12:18:38.970481 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:38.970458 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-t29tf"] Apr 19 12:18:39.042542 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.042513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6d41a062-33f2-4af6-b029-5cad68c33465-config-file\") pod \"limitador-limitador-7d549b5b-t29tf\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:39.042767 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.042599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbp4s\" (UniqueName: \"kubernetes.io/projected/6d41a062-33f2-4af6-b029-5cad68c33465-kube-api-access-kbp4s\") pod \"limitador-limitador-7d549b5b-t29tf\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:39.066856 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.066824 2568 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-t29tf"] Apr 19 12:18:39.143580 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.143551 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6d41a062-33f2-4af6-b029-5cad68c33465-config-file\") pod \"limitador-limitador-7d549b5b-t29tf\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:39.143782 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.143602 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbp4s\" (UniqueName: \"kubernetes.io/projected/6d41a062-33f2-4af6-b029-5cad68c33465-kube-api-access-kbp4s\") pod \"limitador-limitador-7d549b5b-t29tf\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:39.144206 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.144181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6d41a062-33f2-4af6-b029-5cad68c33465-config-file\") pod \"limitador-limitador-7d549b5b-t29tf\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:39.150663 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.150640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbp4s\" (UniqueName: \"kubernetes.io/projected/6d41a062-33f2-4af6-b029-5cad68c33465-kube-api-access-kbp4s\") pod \"limitador-limitador-7d549b5b-t29tf\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:39.275609 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.275530 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:39.403944 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.403915 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-t29tf"] Apr 19 12:18:39.405671 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:18:39.405643 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d41a062_33f2_4af6_b029_5cad68c33465.slice/crio-4065c77936d23a039ff65ceee31dd1fe78048ab67ede077978f2dc34dc3c6119 WatchSource:0}: Error finding container 4065c77936d23a039ff65ceee31dd1fe78048ab67ede077978f2dc34dc3c6119: Status 404 returned error can't find the container with id 4065c77936d23a039ff65ceee31dd1fe78048ab67ede077978f2dc34dc3c6119 Apr 19 12:18:39.966306 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:39.966270 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" event={"ID":"6d41a062-33f2-4af6-b029-5cad68c33465","Type":"ContainerStarted","Data":"4065c77936d23a039ff65ceee31dd1fe78048ab67ede077978f2dc34dc3c6119"} Apr 19 12:18:42.980704 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:42.980668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" event={"ID":"6d41a062-33f2-4af6-b029-5cad68c33465","Type":"ContainerStarted","Data":"192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba"} Apr 19 12:18:42.981088 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:42.980755 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:42.996576 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:42.996524 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" podStartSLOduration=2.322846216 
podStartE2EDuration="4.996510621s" podCreationTimestamp="2026-04-19 12:18:38 +0000 UTC" firstStartedPulling="2026-04-19 12:18:39.407459159 +0000 UTC m=+538.126283292" lastFinishedPulling="2026-04-19 12:18:42.081123564 +0000 UTC m=+540.799947697" observedRunningTime="2026-04-19 12:18:42.994176049 +0000 UTC m=+541.713000217" watchObservedRunningTime="2026-04-19 12:18:42.996510621 +0000 UTC m=+541.715334777" Apr 19 12:18:53.776006 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:53.775967 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-t29tf"] Apr 19 12:18:53.776410 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:53.776217 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" podUID="6d41a062-33f2-4af6-b029-5cad68c33465" containerName="limitador" containerID="cri-o://192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba" gracePeriod=30 Apr 19 12:18:53.776902 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:53.776882 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:54.311598 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:54.311576 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:54.375453 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:54.375387 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6d41a062-33f2-4af6-b029-5cad68c33465-config-file\") pod \"6d41a062-33f2-4af6-b029-5cad68c33465\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " Apr 19 12:18:54.375569 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:54.375485 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbp4s\" (UniqueName: \"kubernetes.io/projected/6d41a062-33f2-4af6-b029-5cad68c33465-kube-api-access-kbp4s\") pod \"6d41a062-33f2-4af6-b029-5cad68c33465\" (UID: \"6d41a062-33f2-4af6-b029-5cad68c33465\") " Apr 19 12:18:54.375800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:54.375781 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d41a062-33f2-4af6-b029-5cad68c33465-config-file" (OuterVolumeSpecName: "config-file") pod "6d41a062-33f2-4af6-b029-5cad68c33465" (UID: "6d41a062-33f2-4af6-b029-5cad68c33465"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:18:54.377402 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:54.377381 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d41a062-33f2-4af6-b029-5cad68c33465-kube-api-access-kbp4s" (OuterVolumeSpecName: "kube-api-access-kbp4s") pod "6d41a062-33f2-4af6-b029-5cad68c33465" (UID: "6d41a062-33f2-4af6-b029-5cad68c33465"). InnerVolumeSpecName "kube-api-access-kbp4s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:18:54.476347 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:54.476320 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbp4s\" (UniqueName: \"kubernetes.io/projected/6d41a062-33f2-4af6-b029-5cad68c33465-kube-api-access-kbp4s\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:18:54.476347 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:54.476344 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6d41a062-33f2-4af6-b029-5cad68c33465-config-file\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:18:55.025238 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.025206 2568 generic.go:358] "Generic (PLEG): container finished" podID="6d41a062-33f2-4af6-b029-5cad68c33465" containerID="192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba" exitCode=0 Apr 19 12:18:55.025675 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.025275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" event={"ID":"6d41a062-33f2-4af6-b029-5cad68c33465","Type":"ContainerDied","Data":"192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba"} Apr 19 12:18:55.025675 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.025286 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" Apr 19 12:18:55.025675 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.025301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-t29tf" event={"ID":"6d41a062-33f2-4af6-b029-5cad68c33465","Type":"ContainerDied","Data":"4065c77936d23a039ff65ceee31dd1fe78048ab67ede077978f2dc34dc3c6119"} Apr 19 12:18:55.025675 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.025318 2568 scope.go:117] "RemoveContainer" containerID="192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba" Apr 19 12:18:55.034277 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.034259 2568 scope.go:117] "RemoveContainer" containerID="192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba" Apr 19 12:18:55.034537 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:18:55.034515 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba\": container with ID starting with 192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba not found: ID does not exist" containerID="192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba" Apr 19 12:18:55.034592 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.034568 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba"} err="failed to get container status \"192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba\": rpc error: code = NotFound desc = could not find container \"192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba\": container with ID starting with 192a09f04e275263e1771e9eda7add5b42dd515dd91fd7b25d2ce6d9c91bbbba not found: ID does not exist" Apr 19 12:18:55.045807 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.045786 
2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-t29tf"] Apr 19 12:18:55.049250 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.049227 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-t29tf"] Apr 19 12:18:55.142530 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.142499 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-8x6qm"] Apr 19 12:18:55.142932 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.142918 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d41a062-33f2-4af6-b029-5cad68c33465" containerName="limitador" Apr 19 12:18:55.143003 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.142933 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d41a062-33f2-4af6-b029-5cad68c33465" containerName="limitador" Apr 19 12:18:55.143003 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.142997 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d41a062-33f2-4af6-b029-5cad68c33465" containerName="limitador" Apr 19 12:18:55.147457 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.147437 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.149688 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.149660 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 19 12:18:55.149830 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.149702 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-bdwwc\"" Apr 19 12:18:55.155230 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.155204 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-8x6qm"] Apr 19 12:18:55.284504 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.284424 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5577bc4c-fe2e-4f1f-857f-c5fa86753ea5-data\") pod \"postgres-868db5846d-8x6qm\" (UID: \"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5\") " pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.284504 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.284478 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlsk6\" (UniqueName: \"kubernetes.io/projected/5577bc4c-fe2e-4f1f-857f-c5fa86753ea5-kube-api-access-xlsk6\") pod \"postgres-868db5846d-8x6qm\" (UID: \"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5\") " pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.385308 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.385274 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5577bc4c-fe2e-4f1f-857f-c5fa86753ea5-data\") pod \"postgres-868db5846d-8x6qm\" (UID: \"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5\") " pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.385484 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.385327 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xlsk6\" (UniqueName: \"kubernetes.io/projected/5577bc4c-fe2e-4f1f-857f-c5fa86753ea5-kube-api-access-xlsk6\") pod \"postgres-868db5846d-8x6qm\" (UID: \"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5\") " pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.385708 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.385685 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/5577bc4c-fe2e-4f1f-857f-c5fa86753ea5-data\") pod \"postgres-868db5846d-8x6qm\" (UID: \"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5\") " pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.392874 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.392853 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlsk6\" (UniqueName: \"kubernetes.io/projected/5577bc4c-fe2e-4f1f-857f-c5fa86753ea5-kube-api-access-xlsk6\") pod \"postgres-868db5846d-8x6qm\" (UID: \"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5\") " pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.460225 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.460199 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:18:55.579191 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.579164 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-8x6qm"] Apr 19 12:18:55.580404 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:18:55.580381 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5577bc4c_fe2e_4f1f_857f_c5fa86753ea5.slice/crio-ae4c2948f7f3b27f37768b5e39b961c81cebfed2f7cc9080ed5cf5ca0cdc79a5 WatchSource:0}: Error finding container ae4c2948f7f3b27f37768b5e39b961c81cebfed2f7cc9080ed5cf5ca0cdc79a5: Status 404 returned error can't find the container with id ae4c2948f7f3b27f37768b5e39b961c81cebfed2f7cc9080ed5cf5ca0cdc79a5 Apr 19 12:18:55.869355 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:55.869276 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d41a062-33f2-4af6-b029-5cad68c33465" path="/var/lib/kubelet/pods/6d41a062-33f2-4af6-b029-5cad68c33465/volumes" Apr 19 12:18:56.029983 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:18:56.029951 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-8x6qm" event={"ID":"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5","Type":"ContainerStarted","Data":"ae4c2948f7f3b27f37768b5e39b961c81cebfed2f7cc9080ed5cf5ca0cdc79a5"} Apr 19 12:19:19.125180 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:19.125136 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-8x6qm" event={"ID":"5577bc4c-fe2e-4f1f-857f-c5fa86753ea5","Type":"ContainerStarted","Data":"900338d7976c502bb7fa9df87dbf8b8b7e73c9c96c9b1a9aab46301e13513d33"} Apr 19 12:19:19.125728 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:19.125237 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:19:19.141554 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:19:19.141508 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-8x6qm" podStartSLOduration=1.4294136960000001 podStartE2EDuration="24.141497827s" podCreationTimestamp="2026-04-19 12:18:55 +0000 UTC" firstStartedPulling="2026-04-19 12:18:55.581800854 +0000 UTC m=+554.300624990" lastFinishedPulling="2026-04-19 12:19:18.293884973 +0000 UTC m=+577.012709121" observedRunningTime="2026-04-19 12:19:19.138689216 +0000 UTC m=+577.857513371" watchObservedRunningTime="2026-04-19 12:19:19.141497827 +0000 UTC m=+577.860321983" Apr 19 12:19:25.157477 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:25.157450 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-8x6qm" Apr 19 12:19:27.947937 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:27.947899 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8q7t5"] Apr 19 12:19:27.951660 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:27.951613 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" Apr 19 12:19:27.953811 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:27.953788 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-t49hl\"" Apr 19 12:19:27.960225 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:27.960196 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8q7t5"] Apr 19 12:19:28.067054 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.067017 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv8jx\" (UniqueName: \"kubernetes.io/projected/e9feb56a-48d1-4a6f-af67-d9841d5e2989-kube-api-access-zv8jx\") pod \"maas-controller-6d4c8f55f9-8q7t5\" (UID: \"e9feb56a-48d1-4a6f-af67-d9841d5e2989\") " pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" Apr 19 12:19:28.077222 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.077192 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-67f974db-sdpwd"] Apr 19 12:19:28.080808 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.080790 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67f974db-sdpwd" Apr 19 12:19:28.093405 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.093387 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67f974db-sdpwd"] Apr 19 12:19:28.167743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.167712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngqzw\" (UniqueName: \"kubernetes.io/projected/ca82cde1-655c-4734-90d8-ecb177aa2983-kube-api-access-ngqzw\") pod \"maas-controller-67f974db-sdpwd\" (UID: \"ca82cde1-655c-4734-90d8-ecb177aa2983\") " pod="opendatahub/maas-controller-67f974db-sdpwd" Apr 19 12:19:28.167904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.167830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv8jx\" (UniqueName: \"kubernetes.io/projected/e9feb56a-48d1-4a6f-af67-d9841d5e2989-kube-api-access-zv8jx\") pod \"maas-controller-6d4c8f55f9-8q7t5\" (UID: \"e9feb56a-48d1-4a6f-af67-d9841d5e2989\") " pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" Apr 19 12:19:28.176850 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.176825 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv8jx\" (UniqueName: \"kubernetes.io/projected/e9feb56a-48d1-4a6f-af67-d9841d5e2989-kube-api-access-zv8jx\") pod \"maas-controller-6d4c8f55f9-8q7t5\" (UID: \"e9feb56a-48d1-4a6f-af67-d9841d5e2989\") " pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" Apr 19 12:19:28.189056 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.189032 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8q7t5"] Apr 19 12:19:28.189292 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.189280 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" Apr 19 12:19:28.219277 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.219252 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-58dcf97978-9t8rf"] Apr 19 12:19:28.223817 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.223798 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58dcf97978-9t8rf" Apr 19 12:19:28.230886 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.230864 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-58dcf97978-9t8rf"] Apr 19 12:19:28.268526 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.268495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngqzw\" (UniqueName: \"kubernetes.io/projected/ca82cde1-655c-4734-90d8-ecb177aa2983-kube-api-access-ngqzw\") pod \"maas-controller-67f974db-sdpwd\" (UID: \"ca82cde1-655c-4734-90d8-ecb177aa2983\") " pod="opendatahub/maas-controller-67f974db-sdpwd" Apr 19 12:19:28.268707 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.268569 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk6wt\" (UniqueName: \"kubernetes.io/projected/dbde9355-42b1-4770-b91a-3f35cd5dfcad-kube-api-access-zk6wt\") pod \"maas-controller-58dcf97978-9t8rf\" (UID: \"dbde9355-42b1-4770-b91a-3f35cd5dfcad\") " pod="opendatahub/maas-controller-58dcf97978-9t8rf" Apr 19 12:19:28.276882 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.276831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngqzw\" (UniqueName: \"kubernetes.io/projected/ca82cde1-655c-4734-90d8-ecb177aa2983-kube-api-access-ngqzw\") pod \"maas-controller-67f974db-sdpwd\" (UID: \"ca82cde1-655c-4734-90d8-ecb177aa2983\") " pod="opendatahub/maas-controller-67f974db-sdpwd" Apr 19 12:19:28.315650 ip-10-0-140-225 
kubenswrapper[2568]: I0419 12:19:28.315608 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8q7t5"] Apr 19 12:19:28.316427 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:19:28.316400 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9feb56a_48d1_4a6f_af67_d9841d5e2989.slice/crio-578cc0c33b89da3e704ae23ec7430255c4da79ce81bd34c6e37a9f5da45d34e3 WatchSource:0}: Error finding container 578cc0c33b89da3e704ae23ec7430255c4da79ce81bd34c6e37a9f5da45d34e3: Status 404 returned error can't find the container with id 578cc0c33b89da3e704ae23ec7430255c4da79ce81bd34c6e37a9f5da45d34e3 Apr 19 12:19:28.369682 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.369652 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk6wt\" (UniqueName: \"kubernetes.io/projected/dbde9355-42b1-4770-b91a-3f35cd5dfcad-kube-api-access-zk6wt\") pod \"maas-controller-58dcf97978-9t8rf\" (UID: \"dbde9355-42b1-4770-b91a-3f35cd5dfcad\") " pod="opendatahub/maas-controller-58dcf97978-9t8rf" Apr 19 12:19:28.376593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.376570 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk6wt\" (UniqueName: \"kubernetes.io/projected/dbde9355-42b1-4770-b91a-3f35cd5dfcad-kube-api-access-zk6wt\") pod \"maas-controller-58dcf97978-9t8rf\" (UID: \"dbde9355-42b1-4770-b91a-3f35cd5dfcad\") " pod="opendatahub/maas-controller-58dcf97978-9t8rf" Apr 19 12:19:28.391367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.391348 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67f974db-sdpwd" Apr 19 12:19:28.508965 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.508943 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67f974db-sdpwd"] Apr 19 12:19:28.510407 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:19:28.510374 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca82cde1_655c_4734_90d8_ecb177aa2983.slice/crio-c02a048d659e9651652e0f3f39185ac7022051634c2f94e96a4c07216d697713 WatchSource:0}: Error finding container c02a048d659e9651652e0f3f39185ac7022051634c2f94e96a4c07216d697713: Status 404 returned error can't find the container with id c02a048d659e9651652e0f3f39185ac7022051634c2f94e96a4c07216d697713 Apr 19 12:19:28.539361 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.539336 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58dcf97978-9t8rf" Apr 19 12:19:28.662867 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:28.662833 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-58dcf97978-9t8rf"] Apr 19 12:19:28.664133 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:19:28.664100 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbde9355_42b1_4770_b91a_3f35cd5dfcad.slice/crio-47da7604d8ec64ef16c773692a2d3dfc49adb03851a74e31ce5b089bf52253f0 WatchSource:0}: Error finding container 47da7604d8ec64ef16c773692a2d3dfc49adb03851a74e31ce5b089bf52253f0: Status 404 returned error can't find the container with id 47da7604d8ec64ef16c773692a2d3dfc49adb03851a74e31ce5b089bf52253f0 Apr 19 12:19:29.169156 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:29.169115 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58dcf97978-9t8rf" 
event={"ID":"dbde9355-42b1-4770-b91a-3f35cd5dfcad","Type":"ContainerStarted","Data":"47da7604d8ec64ef16c773692a2d3dfc49adb03851a74e31ce5b089bf52253f0"} Apr 19 12:19:29.174544 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:29.174508 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67f974db-sdpwd" event={"ID":"ca82cde1-655c-4734-90d8-ecb177aa2983","Type":"ContainerStarted","Data":"c02a048d659e9651652e0f3f39185ac7022051634c2f94e96a4c07216d697713"} Apr 19 12:19:29.177412 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:29.177386 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" event={"ID":"e9feb56a-48d1-4a6f-af67-d9841d5e2989","Type":"ContainerStarted","Data":"578cc0c33b89da3e704ae23ec7430255c4da79ce81bd34c6e37a9f5da45d34e3"} Apr 19 12:19:32.194434 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.194401 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" event={"ID":"e9feb56a-48d1-4a6f-af67-d9841d5e2989","Type":"ContainerStarted","Data":"e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e"} Apr 19 12:19:32.194957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.194463 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" podUID="e9feb56a-48d1-4a6f-af67-d9841d5e2989" containerName="manager" containerID="cri-o://e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e" gracePeriod=10 Apr 19 12:19:32.194957 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.194516 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" Apr 19 12:19:32.195801 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.195771 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58dcf97978-9t8rf" 
event={"ID":"dbde9355-42b1-4770-b91a-3f35cd5dfcad","Type":"ContainerStarted","Data":"e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589"}
Apr 19 12:19:32.195927 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.195817 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-58dcf97978-9t8rf"
Apr 19 12:19:32.197231 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.197205 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67f974db-sdpwd" event={"ID":"ca82cde1-655c-4734-90d8-ecb177aa2983","Type":"ContainerStarted","Data":"766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e"}
Apr 19 12:19:32.197399 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.197381 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-67f974db-sdpwd"
Apr 19 12:19:32.211451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.211416 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" podStartSLOduration=1.981253344 podStartE2EDuration="5.211404202s" podCreationTimestamp="2026-04-19 12:19:27 +0000 UTC" firstStartedPulling="2026-04-19 12:19:28.317610863 +0000 UTC m=+587.036434999" lastFinishedPulling="2026-04-19 12:19:31.547761714 +0000 UTC m=+590.266585857" observedRunningTime="2026-04-19 12:19:32.207473318 +0000 UTC m=+590.926297575" watchObservedRunningTime="2026-04-19 12:19:32.211404202 +0000 UTC m=+590.930228356"
Apr 19 12:19:32.221766 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.221721 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-67f974db-sdpwd" podStartSLOduration=1.172327427 podStartE2EDuration="4.221706021s" podCreationTimestamp="2026-04-19 12:19:28 +0000 UTC" firstStartedPulling="2026-04-19 12:19:28.511733257 +0000 UTC m=+587.230557393" lastFinishedPulling="2026-04-19 12:19:31.561111835 +0000 UTC m=+590.279935987" observedRunningTime="2026-04-19 12:19:32.221520784 +0000 UTC m=+590.940344940" watchObservedRunningTime="2026-04-19 12:19:32.221706021 +0000 UTC m=+590.940530178"
Apr 19 12:19:32.240099 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.240037 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-58dcf97978-9t8rf" podStartSLOduration=1.358429449 podStartE2EDuration="4.240022795s" podCreationTimestamp="2026-04-19 12:19:28 +0000 UTC" firstStartedPulling="2026-04-19 12:19:28.66590597 +0000 UTC m=+587.384730108" lastFinishedPulling="2026-04-19 12:19:31.547499318 +0000 UTC m=+590.266323454" observedRunningTime="2026-04-19 12:19:32.239583575 +0000 UTC m=+590.958407730" watchObservedRunningTime="2026-04-19 12:19:32.240022795 +0000 UTC m=+590.958846952"
Apr 19 12:19:32.476376 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.476355 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5"
Apr 19 12:19:32.612172 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.612138 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv8jx\" (UniqueName: \"kubernetes.io/projected/e9feb56a-48d1-4a6f-af67-d9841d5e2989-kube-api-access-zv8jx\") pod \"e9feb56a-48d1-4a6f-af67-d9841d5e2989\" (UID: \"e9feb56a-48d1-4a6f-af67-d9841d5e2989\") "
Apr 19 12:19:32.614254 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.614226 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9feb56a-48d1-4a6f-af67-d9841d5e2989-kube-api-access-zv8jx" (OuterVolumeSpecName: "kube-api-access-zv8jx") pod "e9feb56a-48d1-4a6f-af67-d9841d5e2989" (UID: "e9feb56a-48d1-4a6f-af67-d9841d5e2989"). InnerVolumeSpecName "kube-api-access-zv8jx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:19:32.713015 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:32.712918 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zv8jx\" (UniqueName: \"kubernetes.io/projected/e9feb56a-48d1-4a6f-af67-d9841d5e2989-kube-api-access-zv8jx\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\""
Apr 19 12:19:33.202072 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.202040 2568 generic.go:358] "Generic (PLEG): container finished" podID="e9feb56a-48d1-4a6f-af67-d9841d5e2989" containerID="e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e" exitCode=0
Apr 19 12:19:33.202529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.202125 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5"
Apr 19 12:19:33.202529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.202134 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" event={"ID":"e9feb56a-48d1-4a6f-af67-d9841d5e2989","Type":"ContainerDied","Data":"e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e"}
Apr 19 12:19:33.202529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.202181 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-8q7t5" event={"ID":"e9feb56a-48d1-4a6f-af67-d9841d5e2989","Type":"ContainerDied","Data":"578cc0c33b89da3e704ae23ec7430255c4da79ce81bd34c6e37a9f5da45d34e3"}
Apr 19 12:19:33.202529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.202203 2568 scope.go:117] "RemoveContainer" containerID="e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e"
Apr 19 12:19:33.211735 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.211721 2568 scope.go:117] "RemoveContainer" containerID="e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e"
Apr 19 12:19:33.211984 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:19:33.211965 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e\": container with ID starting with e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e not found: ID does not exist" containerID="e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e"
Apr 19 12:19:33.212030 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.211993 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e"} err="failed to get container status \"e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e\": rpc error: code = NotFound desc = could not find container \"e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e\": container with ID starting with e3bed121f2c5d1513b41ba237069b503655c8c26392951360196b761a028382e not found: ID does not exist"
Apr 19 12:19:33.222800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.222780 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8q7t5"]
Apr 19 12:19:33.226061 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.226043 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-8q7t5"]
Apr 19 12:19:33.748094 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.748053 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7648cb8989-xmjbt"]
Apr 19 12:19:33.748538 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.748521 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9feb56a-48d1-4a6f-af67-d9841d5e2989" containerName="manager"
Apr 19 12:19:33.748649 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.748540 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9feb56a-48d1-4a6f-af67-d9841d5e2989" containerName="manager"
Apr 19 12:19:33.748714 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.748680 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9feb56a-48d1-4a6f-af67-d9841d5e2989" containerName="manager"
Apr 19 12:19:33.753159 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.753136 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:33.755528 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.755492 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 19 12:19:33.755676 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.755545 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 19 12:19:33.755676 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.755553 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7sgxc\""
Apr 19 12:19:33.758565 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.758533 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7648cb8989-xmjbt"]
Apr 19 12:19:33.822161 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.822132 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnnlv\" (UniqueName: \"kubernetes.io/projected/c8645dd9-d806-4aa1-a434-14fceea686b6-kube-api-access-jnnlv\") pod \"maas-api-7648cb8989-xmjbt\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") " pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:33.822318 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.822227 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls\") pod \"maas-api-7648cb8989-xmjbt\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") " pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:33.869250 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.869223 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9feb56a-48d1-4a6f-af67-d9841d5e2989" path="/var/lib/kubelet/pods/e9feb56a-48d1-4a6f-af67-d9841d5e2989/volumes"
Apr 19 12:19:33.923479 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.923446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls\") pod \"maas-api-7648cb8989-xmjbt\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") " pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:33.923685 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.923552 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnnlv\" (UniqueName: \"kubernetes.io/projected/c8645dd9-d806-4aa1-a434-14fceea686b6-kube-api-access-jnnlv\") pod \"maas-api-7648cb8989-xmjbt\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") " pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:33.923685 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:19:33.923601 2568 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 19 12:19:33.923822 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:19:33.923707 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls podName:c8645dd9-d806-4aa1-a434-14fceea686b6 nodeName:}" failed. No retries permitted until 2026-04-19 12:19:34.423684196 +0000 UTC m=+593.142508335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls") pod "maas-api-7648cb8989-xmjbt" (UID: "c8645dd9-d806-4aa1-a434-14fceea686b6") : secret "maas-api-serving-cert" not found
Apr 19 12:19:33.933121 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:33.933088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnnlv\" (UniqueName: \"kubernetes.io/projected/c8645dd9-d806-4aa1-a434-14fceea686b6-kube-api-access-jnnlv\") pod \"maas-api-7648cb8989-xmjbt\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") " pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:34.428970 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:34.428932 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls\") pod \"maas-api-7648cb8989-xmjbt\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") " pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:34.431289 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:34.431266 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls\") pod \"maas-api-7648cb8989-xmjbt\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") " pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:34.665303 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:34.665269 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:34.786471 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:34.786441 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7648cb8989-xmjbt"]
Apr 19 12:19:34.787449 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:19:34.787424 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8645dd9_d806_4aa1_a434_14fceea686b6.slice/crio-e47d1ebd861a7701e048d9049d2a96ba78ca73dd31813b6bbb71ac04ff5ffcb5 WatchSource:0}: Error finding container e47d1ebd861a7701e048d9049d2a96ba78ca73dd31813b6bbb71ac04ff5ffcb5: Status 404 returned error can't find the container with id e47d1ebd861a7701e048d9049d2a96ba78ca73dd31813b6bbb71ac04ff5ffcb5
Apr 19 12:19:35.211186 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:35.211155 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7648cb8989-xmjbt" event={"ID":"c8645dd9-d806-4aa1-a434-14fceea686b6","Type":"ContainerStarted","Data":"e47d1ebd861a7701e048d9049d2a96ba78ca73dd31813b6bbb71ac04ff5ffcb5"}
Apr 19 12:19:37.221499 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:37.221463 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7648cb8989-xmjbt" event={"ID":"c8645dd9-d806-4aa1-a434-14fceea686b6","Type":"ContainerStarted","Data":"c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a"}
Apr 19 12:19:37.221908 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:37.221518 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:37.237991 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:37.237945 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7648cb8989-xmjbt" podStartSLOduration=2.778791086 podStartE2EDuration="4.237932489s" podCreationTimestamp="2026-04-19 12:19:33 +0000 UTC" firstStartedPulling="2026-04-19 12:19:34.788861282 +0000 UTC m=+593.507685415" lastFinishedPulling="2026-04-19 12:19:36.248002683 +0000 UTC m=+594.966826818" observedRunningTime="2026-04-19 12:19:37.235830375 +0000 UTC m=+595.954654531" watchObservedRunningTime="2026-04-19 12:19:37.237932489 +0000 UTC m=+595.956756643"
Apr 19 12:19:43.207354 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.207322 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-67f974db-sdpwd"
Apr 19 12:19:43.209706 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.207375 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-58dcf97978-9t8rf"
Apr 19 12:19:43.231340 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.231318 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:19:43.265309 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.265279 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67f974db-sdpwd"]
Apr 19 12:19:43.265486 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.265466 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-67f974db-sdpwd" podUID="ca82cde1-655c-4734-90d8-ecb177aa2983" containerName="manager" containerID="cri-o://766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e" gracePeriod=10
Apr 19 12:19:43.506756 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.506733 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67f974db-sdpwd"
Apr 19 12:19:43.536133 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.536108 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-54dc75dff7-rxbn6"]
Apr 19 12:19:43.536510 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.536497 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca82cde1-655c-4734-90d8-ecb177aa2983" containerName="manager"
Apr 19 12:19:43.536554 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.536512 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca82cde1-655c-4734-90d8-ecb177aa2983" containerName="manager"
Apr 19 12:19:43.536599 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.536589 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca82cde1-655c-4734-90d8-ecb177aa2983" containerName="manager"
Apr 19 12:19:43.539672 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.539657 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54dc75dff7-rxbn6"
Apr 19 12:19:43.549443 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.549421 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54dc75dff7-rxbn6"]
Apr 19 12:19:43.610883 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.610852 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngqzw\" (UniqueName: \"kubernetes.io/projected/ca82cde1-655c-4734-90d8-ecb177aa2983-kube-api-access-ngqzw\") pod \"ca82cde1-655c-4734-90d8-ecb177aa2983\" (UID: \"ca82cde1-655c-4734-90d8-ecb177aa2983\") "
Apr 19 12:19:43.611109 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.611090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l46nk\" (UniqueName: \"kubernetes.io/projected/f2fae46d-7d31-40b0-baf3-5107a28bb62a-kube-api-access-l46nk\") pod \"maas-controller-54dc75dff7-rxbn6\" (UID: \"f2fae46d-7d31-40b0-baf3-5107a28bb62a\") " pod="opendatahub/maas-controller-54dc75dff7-rxbn6"
Apr 19 12:19:43.612845 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.612823 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca82cde1-655c-4734-90d8-ecb177aa2983-kube-api-access-ngqzw" (OuterVolumeSpecName: "kube-api-access-ngqzw") pod "ca82cde1-655c-4734-90d8-ecb177aa2983" (UID: "ca82cde1-655c-4734-90d8-ecb177aa2983"). InnerVolumeSpecName "kube-api-access-ngqzw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:19:43.712238 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.712211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l46nk\" (UniqueName: \"kubernetes.io/projected/f2fae46d-7d31-40b0-baf3-5107a28bb62a-kube-api-access-l46nk\") pod \"maas-controller-54dc75dff7-rxbn6\" (UID: \"f2fae46d-7d31-40b0-baf3-5107a28bb62a\") " pod="opendatahub/maas-controller-54dc75dff7-rxbn6"
Apr 19 12:19:43.712368 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.712263 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngqzw\" (UniqueName: \"kubernetes.io/projected/ca82cde1-655c-4734-90d8-ecb177aa2983-kube-api-access-ngqzw\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\""
Apr 19 12:19:43.719910 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.719880 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l46nk\" (UniqueName: \"kubernetes.io/projected/f2fae46d-7d31-40b0-baf3-5107a28bb62a-kube-api-access-l46nk\") pod \"maas-controller-54dc75dff7-rxbn6\" (UID: \"f2fae46d-7d31-40b0-baf3-5107a28bb62a\") " pod="opendatahub/maas-controller-54dc75dff7-rxbn6"
Apr 19 12:19:43.853480 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.853399 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54dc75dff7-rxbn6"
Apr 19 12:19:43.974767 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.974742 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54dc75dff7-rxbn6"]
Apr 19 12:19:43.975596 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:19:43.975578 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2fae46d_7d31_40b0_baf3_5107a28bb62a.slice/crio-df0068fba45e50aae2f90270aa39a512c6cf4613145077b435024dd4546ae698 WatchSource:0}: Error finding container df0068fba45e50aae2f90270aa39a512c6cf4613145077b435024dd4546ae698: Status 404 returned error can't find the container with id df0068fba45e50aae2f90270aa39a512c6cf4613145077b435024dd4546ae698
Apr 19 12:19:43.976984 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:43.976967 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:19:44.252476 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.252443 2568 generic.go:358] "Generic (PLEG): container finished" podID="ca82cde1-655c-4734-90d8-ecb177aa2983" containerID="766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e" exitCode=0
Apr 19 12:19:44.252920 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.252505 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67f974db-sdpwd" event={"ID":"ca82cde1-655c-4734-90d8-ecb177aa2983","Type":"ContainerDied","Data":"766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e"}
Apr 19 12:19:44.252920 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.252508 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67f974db-sdpwd"
Apr 19 12:19:44.252920 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.252547 2568 scope.go:117] "RemoveContainer" containerID="766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e"
Apr 19 12:19:44.252920 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.252537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67f974db-sdpwd" event={"ID":"ca82cde1-655c-4734-90d8-ecb177aa2983","Type":"ContainerDied","Data":"c02a048d659e9651652e0f3f39185ac7022051634c2f94e96a4c07216d697713"}
Apr 19 12:19:44.253667 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.253646 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" event={"ID":"f2fae46d-7d31-40b0-baf3-5107a28bb62a","Type":"ContainerStarted","Data":"df0068fba45e50aae2f90270aa39a512c6cf4613145077b435024dd4546ae698"}
Apr 19 12:19:44.261152 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.261136 2568 scope.go:117] "RemoveContainer" containerID="766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e"
Apr 19 12:19:44.261385 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:19:44.261366 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e\": container with ID starting with 766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e not found: ID does not exist" containerID="766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e"
Apr 19 12:19:44.261448 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.261400 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e"} err="failed to get container status \"766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e\": rpc error: code = NotFound desc = could not find container \"766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e\": container with ID starting with 766f168a618b25ad53ed1acf360baa4b19442a3d6f41c69acd1a23278b83433e not found: ID does not exist"
Apr 19 12:19:44.268199 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.268174 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67f974db-sdpwd"]
Apr 19 12:19:44.271029 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:44.271008 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-67f974db-sdpwd"]
Apr 19 12:19:45.260063 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:45.260035 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" event={"ID":"f2fae46d-7d31-40b0-baf3-5107a28bb62a","Type":"ContainerStarted","Data":"0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5"}
Apr 19 12:19:45.260428 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:45.260074 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-54dc75dff7-rxbn6"
Apr 19 12:19:45.275307 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:45.275262 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" podStartSLOduration=1.9511144740000002 podStartE2EDuration="2.275247813s" podCreationTimestamp="2026-04-19 12:19:43 +0000 UTC" firstStartedPulling="2026-04-19 12:19:43.977088061 +0000 UTC m=+602.695912193" lastFinishedPulling="2026-04-19 12:19:44.301221395 +0000 UTC m=+603.020045532" observedRunningTime="2026-04-19 12:19:45.273699188 +0000 UTC m=+603.992523343" watchObservedRunningTime="2026-04-19 12:19:45.275247813 +0000 UTC m=+603.994072029"
Apr 19 12:19:45.870156 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:45.870120 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca82cde1-655c-4734-90d8-ecb177aa2983" path="/var/lib/kubelet/pods/ca82cde1-655c-4734-90d8-ecb177aa2983/volumes"
Apr 19 12:19:56.269008 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:56.268925 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-54dc75dff7-rxbn6"
Apr 19 12:19:56.310226 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:56.310193 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-58dcf97978-9t8rf"]
Apr 19 12:19:56.310468 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:56.310444 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-58dcf97978-9t8rf" podUID="dbde9355-42b1-4770-b91a-3f35cd5dfcad" containerName="manager" containerID="cri-o://e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589" gracePeriod=10
Apr 19 12:19:56.551145 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:56.551124 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58dcf97978-9t8rf"
Apr 19 12:19:56.727956 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:56.727925 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk6wt\" (UniqueName: \"kubernetes.io/projected/dbde9355-42b1-4770-b91a-3f35cd5dfcad-kube-api-access-zk6wt\") pod \"dbde9355-42b1-4770-b91a-3f35cd5dfcad\" (UID: \"dbde9355-42b1-4770-b91a-3f35cd5dfcad\") "
Apr 19 12:19:56.729947 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:56.729923 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbde9355-42b1-4770-b91a-3f35cd5dfcad-kube-api-access-zk6wt" (OuterVolumeSpecName: "kube-api-access-zk6wt") pod "dbde9355-42b1-4770-b91a-3f35cd5dfcad" (UID: "dbde9355-42b1-4770-b91a-3f35cd5dfcad"). InnerVolumeSpecName "kube-api-access-zk6wt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:19:56.829646 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:56.829536 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zk6wt\" (UniqueName: \"kubernetes.io/projected/dbde9355-42b1-4770-b91a-3f35cd5dfcad-kube-api-access-zk6wt\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\""
Apr 19 12:19:57.314451 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.314412 2568 generic.go:358] "Generic (PLEG): container finished" podID="dbde9355-42b1-4770-b91a-3f35cd5dfcad" containerID="e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589" exitCode=0
Apr 19 12:19:57.314904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.314477 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58dcf97978-9t8rf"
Apr 19 12:19:57.314904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.314498 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58dcf97978-9t8rf" event={"ID":"dbde9355-42b1-4770-b91a-3f35cd5dfcad","Type":"ContainerDied","Data":"e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589"}
Apr 19 12:19:57.314904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.314535 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58dcf97978-9t8rf" event={"ID":"dbde9355-42b1-4770-b91a-3f35cd5dfcad","Type":"ContainerDied","Data":"47da7604d8ec64ef16c773692a2d3dfc49adb03851a74e31ce5b089bf52253f0"}
Apr 19 12:19:57.314904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.314552 2568 scope.go:117] "RemoveContainer" containerID="e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589"
Apr 19 12:19:57.323334 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.323320 2568 scope.go:117] "RemoveContainer" containerID="e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589"
Apr 19 12:19:57.323583 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:19:57.323567 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589\": container with ID starting with e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589 not found: ID does not exist" containerID="e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589"
Apr 19 12:19:57.323650 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.323590 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589"} err="failed to get container status \"e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589\": rpc error: code = NotFound desc = could not find container \"e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589\": container with ID starting with e25f5de3749430e5ef659acd92c019e8e1f78536f485442631fd075891d66589 not found: ID does not exist"
Apr 19 12:19:57.336418 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.336397 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-58dcf97978-9t8rf"]
Apr 19 12:19:57.343378 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.343358 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-58dcf97978-9t8rf"]
Apr 19 12:19:57.869202 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:19:57.869166 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbde9355-42b1-4770-b91a-3f35cd5dfcad" path="/var/lib/kubelet/pods/dbde9355-42b1-4770-b91a-3f35cd5dfcad/volumes"
Apr 19 12:20:11.368862 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.368825 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"]
Apr 19 12:20:11.369220 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.369204 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dbde9355-42b1-4770-b91a-3f35cd5dfcad" containerName="manager"
Apr 19 12:20:11.369220 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.369214 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbde9355-42b1-4770-b91a-3f35cd5dfcad" containerName="manager"
Apr 19 12:20:11.369295 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.369277 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="dbde9355-42b1-4770-b91a-3f35cd5dfcad" containerName="manager"
Apr 19 12:20:11.377147 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.377126 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.380392 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.380367 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 19 12:20:11.380497 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.380367 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-s696f\""
Apr 19 12:20:11.380562 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.380375 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 19 12:20:11.380665 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.380643 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 19 12:20:11.381313 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.381288 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"]
Apr 19 12:20:11.437749 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.437718 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.437876 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.437763 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.437876 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.437841 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.437997 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.437895 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swb97\" (UniqueName: \"kubernetes.io/projected/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-kube-api-access-swb97\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.437997 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.437933 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.438068 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.438011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.539846 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.539367 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swb97\" (UniqueName: \"kubernetes.io/projected/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-kube-api-access-swb97\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.540027 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.539899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.540027 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.539970 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.540143 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.540036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.540143 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.540079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.540243 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.540164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.540666 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.540619 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:11.540943 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.540919 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-kserve-provision-location\")
pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" Apr 19 12:20:11.544642 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.541577 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" Apr 19 12:20:11.544642 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.543950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" Apr 19 12:20:11.544642 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.544163 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" Apr 19 12:20:11.548423 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.548397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swb97\" (UniqueName: \"kubernetes.io/projected/62e1fbc8-9352-4ede-847f-5f3523ddd9d7-kube-api-access-swb97\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6\" (UID: \"62e1fbc8-9352-4ede-847f-5f3523ddd9d7\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" Apr 19 12:20:11.689427 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.689397 2568 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" Apr 19 12:20:11.821171 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:11.821144 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"] Apr 19 12:20:11.823671 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:20:11.823640 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62e1fbc8_9352_4ede_847f_5f3523ddd9d7.slice/crio-a59b1dec656881766b9696df4297c2613f15bc0ce3148586c476e24fa00a6da3 WatchSource:0}: Error finding container a59b1dec656881766b9696df4297c2613f15bc0ce3148586c476e24fa00a6da3: Status 404 returned error can't find the container with id a59b1dec656881766b9696df4297c2613f15bc0ce3148586c476e24fa00a6da3 Apr 19 12:20:12.267374 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.267343 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5"] Apr 19 12:20:12.273003 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.272983 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.275513 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.275487 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 19 12:20:12.278895 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.278874 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5"] Apr 19 12:20:12.345808 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.345777 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86t69\" (UniqueName: \"kubernetes.io/projected/0df244f0-d09b-4e97-9d48-46e4ba43877f-kube-api-access-86t69\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.345954 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.345829 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.345954 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.345906 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.345954 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.345949 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0df244f0-d09b-4e97-9d48-46e4ba43877f-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.346093 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.345973 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.346093 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.346019 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.385035 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.385000 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" event={"ID":"62e1fbc8-9352-4ede-847f-5f3523ddd9d7","Type":"ContainerStarted","Data":"a59b1dec656881766b9696df4297c2613f15bc0ce3148586c476e24fa00a6da3"} Apr 19 12:20:12.446423 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446398 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86t69\" (UniqueName: \"kubernetes.io/projected/0df244f0-d09b-4e97-9d48-46e4ba43877f-kube-api-access-86t69\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446485 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0df244f0-d09b-4e97-9d48-46e4ba43877f-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446509 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446553 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446889 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446868 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446935 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446909 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.446980 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.446960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.448708 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.448684 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0df244f0-d09b-4e97-9d48-46e4ba43877f-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.448945 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:20:12.448930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0df244f0-d09b-4e97-9d48-46e4ba43877f-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.456609 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.456585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86t69\" (UniqueName: \"kubernetes.io/projected/0df244f0-d09b-4e97-9d48-46e4ba43877f-kube-api-access-86t69\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5\" (UID: \"0df244f0-d09b-4e97-9d48-46e4ba43877f\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.586280 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.586183 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" Apr 19 12:20:12.729609 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:12.727144 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5"] Apr 19 12:20:13.391219 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:13.391186 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" event={"ID":"0df244f0-d09b-4e97-9d48-46e4ba43877f","Type":"ContainerStarted","Data":"fcb6e759553ae7294f3a56d98af2ecc04516c140e77895a2ece443c4ea92126c"} Apr 19 12:20:16.108721 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.108668 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-d69f7d5bf-c5mss"] Apr 19 12:20:16.118114 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.118087 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-d69f7d5bf-c5mss"] Apr 19 12:20:16.118242 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.118218 2568 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:16.180666 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.180614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr6t8\" (UniqueName: \"kubernetes.io/projected/46397433-f672-4827-bf15-5b131629f949-kube-api-access-cr6t8\") pod \"maas-api-d69f7d5bf-c5mss\" (UID: \"46397433-f672-4827-bf15-5b131629f949\") " pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:16.180832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.180721 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/46397433-f672-4827-bf15-5b131629f949-maas-api-tls\") pod \"maas-api-d69f7d5bf-c5mss\" (UID: \"46397433-f672-4827-bf15-5b131629f949\") " pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:16.281614 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.281574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr6t8\" (UniqueName: \"kubernetes.io/projected/46397433-f672-4827-bf15-5b131629f949-kube-api-access-cr6t8\") pod \"maas-api-d69f7d5bf-c5mss\" (UID: \"46397433-f672-4827-bf15-5b131629f949\") " pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:16.281813 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.281653 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/46397433-f672-4827-bf15-5b131629f949-maas-api-tls\") pod \"maas-api-d69f7d5bf-c5mss\" (UID: \"46397433-f672-4827-bf15-5b131629f949\") " pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:16.284326 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.284294 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: 
\"kubernetes.io/secret/46397433-f672-4827-bf15-5b131629f949-maas-api-tls\") pod \"maas-api-d69f7d5bf-c5mss\" (UID: \"46397433-f672-4827-bf15-5b131629f949\") " pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:16.288937 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.288895 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr6t8\" (UniqueName: \"kubernetes.io/projected/46397433-f672-4827-bf15-5b131629f949-kube-api-access-cr6t8\") pod \"maas-api-d69f7d5bf-c5mss\" (UID: \"46397433-f672-4827-bf15-5b131629f949\") " pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:16.439334 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:16.439308 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:18.228313 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:18.227598 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-d69f7d5bf-c5mss"] Apr 19 12:20:18.418641 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:18.418527 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-d69f7d5bf-c5mss" event={"ID":"46397433-f672-4827-bf15-5b131629f949","Type":"ContainerStarted","Data":"662fe0501a3b0e3575c9f445158aba6117e9711b867fc6df1739124e8894d81d"} Apr 19 12:20:18.420070 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:18.420044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" event={"ID":"0df244f0-d09b-4e97-9d48-46e4ba43877f","Type":"ContainerStarted","Data":"ace2c15a0d1cd86abe78571037b1944dccccec90e6186919246359f43133707a"} Apr 19 12:20:18.421533 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:18.421504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" 
event={"ID":"62e1fbc8-9352-4ede-847f-5f3523ddd9d7","Type":"ContainerStarted","Data":"566bad11df07c22e163c266c9593a1a5fbf4bf8f3d837d8c12c73c1932975cef"} Apr 19 12:20:20.438844 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.438802 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-d69f7d5bf-c5mss" event={"ID":"46397433-f672-4827-bf15-5b131629f949","Type":"ContainerStarted","Data":"6b5917aba699cfce5f2a9575421a1fa9bfeb293ddc588fe26e83880a65863d32"} Apr 19 12:20:20.439651 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.439614 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-d69f7d5bf-c5mss" Apr 19 12:20:20.454356 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.454302 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-d69f7d5bf-c5mss" podStartSLOduration=2.849529274 podStartE2EDuration="4.454283965s" podCreationTimestamp="2026-04-19 12:20:16 +0000 UTC" firstStartedPulling="2026-04-19 12:20:18.239883307 +0000 UTC m=+636.958707457" lastFinishedPulling="2026-04-19 12:20:19.844638012 +0000 UTC m=+638.563462148" observedRunningTime="2026-04-19 12:20:20.45380696 +0000 UTC m=+639.172631116" watchObservedRunningTime="2026-04-19 12:20:20.454283965 +0000 UTC m=+639.173108122" Apr 19 12:20:20.868907 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.868826 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"] Apr 19 12:20:20.872582 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.872566 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:20.874789 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.874768 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 19 12:20:20.879861 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.879838 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"] Apr 19 12:20:20.928322 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.928280 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/582f660a-326a-427a-9102-b23ffada154c-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:20.928518 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.928415 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlg4n\" (UniqueName: \"kubernetes.io/projected/582f660a-326a-427a-9102-b23ffada154c-kube-api-access-tlg4n\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:20.928518 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.928484 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:20.928652 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.928616 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:20.928717 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.928697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:20.928778 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:20.928751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:21.030856 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.030812 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/582f660a-326a-427a-9102-b23ffada154c-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:21.031151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.031129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlg4n\" (UniqueName: \"kubernetes.io/projected/582f660a-326a-427a-9102-b23ffada154c-kube-api-access-tlg4n\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:21.031327 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.031312 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:21.031461 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.031445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:21.031594 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.031581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:21.031731 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.031717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" Apr 19 12:20:21.032490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.032075 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:21.032490 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.032287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:21.032680 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.032530 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:21.041671 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.039473 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/582f660a-326a-427a-9102-b23ffada154c-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:21.052508 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.052447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/582f660a-326a-427a-9102-b23ffada154c-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:21.054611 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.054585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlg4n\" (UniqueName: \"kubernetes.io/projected/582f660a-326a-427a-9102-b23ffada154c-kube-api-access-tlg4n\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-wmwqn\" (UID: \"582f660a-326a-427a-9102-b23ffada154c\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:21.185249 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.185214 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:21.330523 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.330479 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"]
Apr 19 12:20:21.342994 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:20:21.342939 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod582f660a_326a_427a_9102_b23ffada154c.slice/crio-835ecc6a00922709d9ccbbcfcdabecd70b67f530c0b4cbf57dab8800c2f1e6a1 WatchSource:0}: Error finding container 835ecc6a00922709d9ccbbcfcdabecd70b67f530c0b4cbf57dab8800c2f1e6a1: Status 404 returned error can't find the container with id 835ecc6a00922709d9ccbbcfcdabecd70b67f530c0b4cbf57dab8800c2f1e6a1
Apr 19 12:20:21.445586 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.445426 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" event={"ID":"582f660a-326a-427a-9102-b23ffada154c","Type":"ContainerStarted","Data":"bf91dbbea1edb87961d6b497481cf78fb9251d9134580a3dbcea1190ebc6732c"}
Apr 19 12:20:21.445586 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:21.445478 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" event={"ID":"582f660a-326a-427a-9102-b23ffada154c","Type":"ContainerStarted","Data":"835ecc6a00922709d9ccbbcfcdabecd70b67f530c0b4cbf57dab8800c2f1e6a1"}
Apr 19 12:20:24.461710 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:24.461673 2568 generic.go:358] "Generic (PLEG): container finished" podID="62e1fbc8-9352-4ede-847f-5f3523ddd9d7" containerID="566bad11df07c22e163c266c9593a1a5fbf4bf8f3d837d8c12c73c1932975cef" exitCode=0
Apr 19 12:20:24.462065 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:24.461757 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" event={"ID":"62e1fbc8-9352-4ede-847f-5f3523ddd9d7","Type":"ContainerDied","Data":"566bad11df07c22e163c266c9593a1a5fbf4bf8f3d837d8c12c73c1932975cef"}
Apr 19 12:20:24.463347 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:24.463330 2568 generic.go:358] "Generic (PLEG): container finished" podID="0df244f0-d09b-4e97-9d48-46e4ba43877f" containerID="ace2c15a0d1cd86abe78571037b1944dccccec90e6186919246359f43133707a" exitCode=0
Apr 19 12:20:24.463405 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:24.463381 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" event={"ID":"0df244f0-d09b-4e97-9d48-46e4ba43877f","Type":"ContainerDied","Data":"ace2c15a0d1cd86abe78571037b1944dccccec90e6186919246359f43133707a"}
Apr 19 12:20:26.473208 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:26.473175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" event={"ID":"62e1fbc8-9352-4ede-847f-5f3523ddd9d7","Type":"ContainerStarted","Data":"ea274b87e24e4ff0d04eaa504530e58ce279cf7a8f24d933d0a4ea9fab1fa55e"}
Apr 19 12:20:26.473674 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:26.473402 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:26.474953 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:26.474930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" event={"ID":"0df244f0-d09b-4e97-9d48-46e4ba43877f","Type":"ContainerStarted","Data":"48f0f9d81fc98ca7b053fd695b011a811b3dbb18b0eef5324581a73604c8d1ca"}
Apr 19 12:20:26.475119 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:26.475105 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5"
Apr 19 12:20:26.491063 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:26.491022 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6" podStartSLOduration=1.420816829 podStartE2EDuration="15.491006342s" podCreationTimestamp="2026-04-19 12:20:11 +0000 UTC" firstStartedPulling="2026-04-19 12:20:11.827198993 +0000 UTC m=+630.546023138" lastFinishedPulling="2026-04-19 12:20:25.89738851 +0000 UTC m=+644.616212651" observedRunningTime="2026-04-19 12:20:26.489416138 +0000 UTC m=+645.208240293" watchObservedRunningTime="2026-04-19 12:20:26.491006342 +0000 UTC m=+645.209830502"
Apr 19 12:20:26.506762 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:26.506718 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5" podStartSLOduration=1.348794227 podStartE2EDuration="14.50670705s" podCreationTimestamp="2026-04-19 12:20:12 +0000 UTC" firstStartedPulling="2026-04-19 12:20:12.732853968 +0000 UTC m=+631.451678114" lastFinishedPulling="2026-04-19 12:20:25.890766799 +0000 UTC m=+644.609590937" observedRunningTime="2026-04-19 12:20:26.503579991 +0000 UTC m=+645.222404148" watchObservedRunningTime="2026-04-19 12:20:26.50670705 +0000 UTC m=+645.225531205"
Apr 19 12:20:27.456340 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.456314 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-d69f7d5bf-c5mss"
Apr 19 12:20:27.480359 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.480329 2568 generic.go:358] "Generic (PLEG): container finished" podID="582f660a-326a-427a-9102-b23ffada154c" containerID="bf91dbbea1edb87961d6b497481cf78fb9251d9134580a3dbcea1190ebc6732c" exitCode=0
Apr 19 12:20:27.480782 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.480400 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" event={"ID":"582f660a-326a-427a-9102-b23ffada154c","Type":"ContainerDied","Data":"bf91dbbea1edb87961d6b497481cf78fb9251d9134580a3dbcea1190ebc6732c"}
Apr 19 12:20:27.498686 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.498609 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7648cb8989-xmjbt"]
Apr 19 12:20:27.498902 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.498880 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7648cb8989-xmjbt" podUID="c8645dd9-d806-4aa1-a434-14fceea686b6" containerName="maas-api" containerID="cri-o://c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a" gracePeriod=30
Apr 19 12:20:27.780049 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.780030 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:20:27.898418 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.898390 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls\") pod \"c8645dd9-d806-4aa1-a434-14fceea686b6\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") "
Apr 19 12:20:27.898593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.898428 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnnlv\" (UniqueName: \"kubernetes.io/projected/c8645dd9-d806-4aa1-a434-14fceea686b6-kube-api-access-jnnlv\") pod \"c8645dd9-d806-4aa1-a434-14fceea686b6\" (UID: \"c8645dd9-d806-4aa1-a434-14fceea686b6\") "
Apr 19 12:20:27.900327 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.900300 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "c8645dd9-d806-4aa1-a434-14fceea686b6" (UID: "c8645dd9-d806-4aa1-a434-14fceea686b6"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:20:27.900424 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.900394 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8645dd9-d806-4aa1-a434-14fceea686b6-kube-api-access-jnnlv" (OuterVolumeSpecName: "kube-api-access-jnnlv") pod "c8645dd9-d806-4aa1-a434-14fceea686b6" (UID: "c8645dd9-d806-4aa1-a434-14fceea686b6"). InnerVolumeSpecName "kube-api-access-jnnlv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:20:27.999311 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.999281 2568 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c8645dd9-d806-4aa1-a434-14fceea686b6-maas-api-tls\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\""
Apr 19 12:20:27.999311 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:27.999307 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jnnlv\" (UniqueName: \"kubernetes.io/projected/c8645dd9-d806-4aa1-a434-14fceea686b6-kube-api-access-jnnlv\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\""
Apr 19 12:20:28.486084 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.486040 2568 generic.go:358] "Generic (PLEG): container finished" podID="c8645dd9-d806-4aa1-a434-14fceea686b6" containerID="c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a" exitCode=0
Apr 19 12:20:28.486501 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.486108 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7648cb8989-xmjbt"
Apr 19 12:20:28.486501 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.486121 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7648cb8989-xmjbt" event={"ID":"c8645dd9-d806-4aa1-a434-14fceea686b6","Type":"ContainerDied","Data":"c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a"}
Apr 19 12:20:28.486501 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.486160 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7648cb8989-xmjbt" event={"ID":"c8645dd9-d806-4aa1-a434-14fceea686b6","Type":"ContainerDied","Data":"e47d1ebd861a7701e048d9049d2a96ba78ca73dd31813b6bbb71ac04ff5ffcb5"}
Apr 19 12:20:28.486501 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.486180 2568 scope.go:117] "RemoveContainer" containerID="c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a"
Apr 19 12:20:28.487976 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.487955 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" event={"ID":"582f660a-326a-427a-9102-b23ffada154c","Type":"ContainerStarted","Data":"4669c74c32c9865326b4a5c0ad88c9ac399a8ded0555164981f406a182d2b2cd"}
Apr 19 12:20:28.488181 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.488161 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:28.495554 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.495538 2568 scope.go:117] "RemoveContainer" containerID="c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a"
Apr 19 12:20:28.495900 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:20:28.495878 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a\": container with ID starting with c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a not found: ID does not exist" containerID="c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a"
Apr 19 12:20:28.495981 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.495905 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a"} err="failed to get container status \"c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a\": rpc error: code = NotFound desc = could not find container \"c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a\": container with ID starting with c7712ad66b5a13c0eb11521e828eb52bdefc79f7bead2e1c1f24d560778a148a not found: ID does not exist"
Apr 19 12:20:28.506809 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.506745 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn" podStartSLOduration=8.309914556 podStartE2EDuration="8.506734646s" podCreationTimestamp="2026-04-19 12:20:20 +0000 UTC" firstStartedPulling="2026-04-19 12:20:27.481096419 +0000 UTC m=+646.199920552" lastFinishedPulling="2026-04-19 12:20:27.677916498 +0000 UTC m=+646.396740642" observedRunningTime="2026-04-19 12:20:28.504269196 +0000 UTC m=+647.223093350" watchObservedRunningTime="2026-04-19 12:20:28.506734646 +0000 UTC m=+647.225558813"
Apr 19 12:20:28.516675 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.516655 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7648cb8989-xmjbt"]
Apr 19 12:20:28.519917 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:28.519897 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7648cb8989-xmjbt"]
Apr 19 12:20:29.876773 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:29.876736 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8645dd9-d806-4aa1-a434-14fceea686b6" path="/var/lib/kubelet/pods/c8645dd9-d806-4aa1-a434-14fceea686b6/volumes"
Apr 19 12:20:37.493989 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:37.493955 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6"
Apr 19 12:20:37.494858 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:37.494830 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5"
Apr 19 12:20:39.506110 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:39.506081 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-wmwqn"
Apr 19 12:20:57.767863 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.767827 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"]
Apr 19 12:20:57.768260 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.768213 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8645dd9-d806-4aa1-a434-14fceea686b6" containerName="maas-api"
Apr 19 12:20:57.768260 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.768223 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8645dd9-d806-4aa1-a434-14fceea686b6" containerName="maas-api"
Apr 19 12:20:57.768336 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.768306 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8645dd9-d806-4aa1-a434-14fceea686b6" containerName="maas-api"
Apr 19 12:20:57.771342 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.771326 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.772408 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.772378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.772529 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.772458 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.772569 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.772530 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd296\" (UniqueName: \"kubernetes.io/projected/84a7220d-f9d5-4840-8c8f-13fedc64c3db-kube-api-access-sd296\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.772657 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.772620 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.772712 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.772698 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.772752 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.772724 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a7220d-f9d5-4840-8c8f-13fedc64c3db-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.773842 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.773829 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 19 12:20:57.781970 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.781946 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"]
Apr 19 12:20:57.873441 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873441 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873471 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd296\" (UniqueName: \"kubernetes.io/projected/84a7220d-f9d5-4840-8c8f-13fedc64c3db-kube-api-access-sd296\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873521 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873699 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a7220d-f9d5-4840-8c8f-13fedc64c3db-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873865 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873920 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.873975 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.873959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.875714 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.875691 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/84a7220d-f9d5-4840-8c8f-13fedc64c3db-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.876027 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.876008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/84a7220d-f9d5-4840-8c8f-13fedc64c3db-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:57.880452 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:57.880432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd296\" (UniqueName: \"kubernetes.io/projected/84a7220d-f9d5-4840-8c8f-13fedc64c3db-kube-api-access-sd296\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv\" (UID: \"84a7220d-f9d5-4840-8c8f-13fedc64c3db\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:58.082441 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:58.082371 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:20:58.212300 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:58.212275 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"]
Apr 19 12:20:58.213509 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:20:58.213489 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a7220d_f9d5_4840_8c8f_13fedc64c3db.slice/crio-d7f5e2278cc2206bef3ce52a6e4f6e923942e6a09749a6b53f7516b7dbb84bd4 WatchSource:0}: Error finding container d7f5e2278cc2206bef3ce52a6e4f6e923942e6a09749a6b53f7516b7dbb84bd4: Status 404 returned error can't find the container with id d7f5e2278cc2206bef3ce52a6e4f6e923942e6a09749a6b53f7516b7dbb84bd4
Apr 19 12:20:58.619882 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:58.619843 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv" event={"ID":"84a7220d-f9d5-4840-8c8f-13fedc64c3db","Type":"ContainerStarted","Data":"33135be097c5015f88aaeee44d6fad472ff7d5274795343a22886f2af076d72a"}
Apr 19 12:20:58.619882 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:20:58.619888 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv" event={"ID":"84a7220d-f9d5-4840-8c8f-13fedc64c3db","Type":"ContainerStarted","Data":"d7f5e2278cc2206bef3ce52a6e4f6e923942e6a09749a6b53f7516b7dbb84bd4"}
Apr 19 12:21:04.657057 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:04.657022 2568 generic.go:358] "Generic (PLEG): container finished" podID="84a7220d-f9d5-4840-8c8f-13fedc64c3db" containerID="33135be097c5015f88aaeee44d6fad472ff7d5274795343a22886f2af076d72a" exitCode=0
Apr 19 12:21:04.657442 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:04.657097 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv" event={"ID":"84a7220d-f9d5-4840-8c8f-13fedc64c3db","Type":"ContainerDied","Data":"33135be097c5015f88aaeee44d6fad472ff7d5274795343a22886f2af076d72a"}
Apr 19 12:21:05.662684 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:05.662653 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv" event={"ID":"84a7220d-f9d5-4840-8c8f-13fedc64c3db","Type":"ContainerStarted","Data":"453a35dcb677c3b246a977955c8652e888eddf3130139376a632884046151c4f"}
Apr 19 12:21:05.663055 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:05.662883 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv"
Apr 19 12:21:05.679365 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:05.679320 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv" podStartSLOduration=8.518254436 podStartE2EDuration="8.67930845s" podCreationTimestamp="2026-04-19 12:20:57 +0000 UTC" firstStartedPulling="2026-04-19 12:21:04.657723797 +0000 UTC m=+683.376547933" lastFinishedPulling="2026-04-19 12:21:04.818777815 +0000 UTC m=+683.537601947" observedRunningTime="2026-04-19 12:21:05.678802559 +0000 UTC m=+684.397626727" watchObservedRunningTime="2026-04-19 12:21:05.67930845 +0000 UTC m=+684.398132605"
Apr 19 12:21:14.670355 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.670305 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"]
Apr 19 12:21:14.675031 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.674998 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.677416 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.677388 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 19 12:21:14.682533 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.682508 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"]
Apr 19 12:21:14.729334 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.729305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.729489 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.729362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.729489 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.729454 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.729593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.729521 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7pz5\" (UniqueName: \"kubernetes.io/projected/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-kube-api-access-d7pz5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.729593 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.729561 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.729722 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.729636 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.830862 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.830831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.831151 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.831118 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.831307 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.831290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.831472 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.831447 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7pz5\" (UniqueName: \"kubernetes.io/projected/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-kube-api-access-d7pz5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.831608 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.831592 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.831839 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.831810 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.832690 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.832667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.833020 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.832995 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.833278 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.833255 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.834354 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.834334 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.834762 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.834744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.840206 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.840187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7pz5\" (UniqueName: \"kubernetes.io/projected/2a7fb481-5ff3-45ce-9dff-4f91f0425b77-kube-api-access-d7pz5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2\" (UID: \"2a7fb481-5ff3-45ce-9dff-4f91f0425b77\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"
Apr 19 12:21:14.987071 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:14.987037 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" Apr 19 12:21:15.117696 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:15.117672 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2"] Apr 19 12:21:15.119364 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:21:15.119336 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7fb481_5ff3_45ce_9dff_4f91f0425b77.slice/crio-041f71eeb6814edccbbe25b500f81c4257bee369b8a9aca8149bc05df8287283 WatchSource:0}: Error finding container 041f71eeb6814edccbbe25b500f81c4257bee369b8a9aca8149bc05df8287283: Status 404 returned error can't find the container with id 041f71eeb6814edccbbe25b500f81c4257bee369b8a9aca8149bc05df8287283 Apr 19 12:21:15.708095 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:15.708055 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" event={"ID":"2a7fb481-5ff3-45ce-9dff-4f91f0425b77","Type":"ContainerStarted","Data":"fd05f1f00db0390cbc4c8b921e1cc87ed0cccb17aba42a788e16bf11cffdee64"} Apr 19 12:21:15.708095 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:15.708095 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" event={"ID":"2a7fb481-5ff3-45ce-9dff-4f91f0425b77","Type":"ContainerStarted","Data":"041f71eeb6814edccbbe25b500f81c4257bee369b8a9aca8149bc05df8287283"} Apr 19 12:21:16.685881 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:16.685840 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv" Apr 19 12:21:23.749147 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.749055 2568 generic.go:358] "Generic (PLEG): container finished" podID="2a7fb481-5ff3-45ce-9dff-4f91f0425b77" 
containerID="fd05f1f00db0390cbc4c8b921e1cc87ed0cccb17aba42a788e16bf11cffdee64" exitCode=0 Apr 19 12:21:23.749147 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.749098 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" event={"ID":"2a7fb481-5ff3-45ce-9dff-4f91f0425b77","Type":"ContainerDied","Data":"fd05f1f00db0390cbc4c8b921e1cc87ed0cccb17aba42a788e16bf11cffdee64"} Apr 19 12:21:23.800938 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.800897 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp"] Apr 19 12:21:23.804897 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.804881 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:23.807342 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.807314 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 19 12:21:23.814675 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.814654 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp"] Apr 19 12:21:23.920174 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.920147 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:23.920274 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.920211 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:23.920274 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.920243 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:23.920391 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.920285 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:23.920444 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.920413 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmvh6\" (UniqueName: \"kubernetes.io/projected/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-kube-api-access-jmvh6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:23.920486 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:23.920449 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: 
\"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.021870 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.021798 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.021895 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmvh6\" (UniqueName: \"kubernetes.io/projected/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-kube-api-access-jmvh6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.021926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.021960 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022013 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.021996 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022227 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.022013 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022383 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.022362 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022443 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.022402 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.022480 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.022448 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.023926 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.023906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.024280 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.024262 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.029486 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.029458 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmvh6\" (UniqueName: \"kubernetes.io/projected/5a956600-95ba-4723-a8f3-0c10bdd4cbe7-kube-api-access-jmvh6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp\" (UID: \"5a956600-95ba-4723-a8f3-0c10bdd4cbe7\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.116956 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.116920 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:24.259057 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.259031 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp"] Apr 19 12:21:24.259655 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:21:24.259613 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a956600_95ba_4723_a8f3_0c10bdd4cbe7.slice/crio-008eecd95803174dc06503cff0c9874c6279e3e7039609cd9c01e0629a92bc11 WatchSource:0}: Error finding container 008eecd95803174dc06503cff0c9874c6279e3e7039609cd9c01e0629a92bc11: Status 404 returned error can't find the container with id 008eecd95803174dc06503cff0c9874c6279e3e7039609cd9c01e0629a92bc11 Apr 19 12:21:24.754867 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.754830 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" event={"ID":"2a7fb481-5ff3-45ce-9dff-4f91f0425b77","Type":"ContainerStarted","Data":"0426bb7fcfd112b874884a8b9bd1fa9d098c1c43fcf993fa4fae3fe94d134adc"} Apr 19 12:21:24.755344 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.755055 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" Apr 19 12:21:24.756347 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.756323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" event={"ID":"5a956600-95ba-4723-a8f3-0c10bdd4cbe7","Type":"ContainerStarted","Data":"5c758c47630648de7f5e7f02feba4149adeeabc04472e0e61246e81e441bd55a"} Apr 19 12:21:24.756459 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.756353 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" 
event={"ID":"5a956600-95ba-4723-a8f3-0c10bdd4cbe7","Type":"ContainerStarted","Data":"008eecd95803174dc06503cff0c9874c6279e3e7039609cd9c01e0629a92bc11"} Apr 19 12:21:24.773659 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:24.773598 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" podStartSLOduration=10.615565347 podStartE2EDuration="10.773585182s" podCreationTimestamp="2026-04-19 12:21:14 +0000 UTC" firstStartedPulling="2026-04-19 12:21:23.749748759 +0000 UTC m=+702.468572895" lastFinishedPulling="2026-04-19 12:21:23.907768594 +0000 UTC m=+702.626592730" observedRunningTime="2026-04-19 12:21:24.770226405 +0000 UTC m=+703.489050553" watchObservedRunningTime="2026-04-19 12:21:24.773585182 +0000 UTC m=+703.492409335" Apr 19 12:21:29.777651 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:29.777606 2568 generic.go:358] "Generic (PLEG): container finished" podID="5a956600-95ba-4723-a8f3-0c10bdd4cbe7" containerID="5c758c47630648de7f5e7f02feba4149adeeabc04472e0e61246e81e441bd55a" exitCode=0 Apr 19 12:21:29.778068 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:29.777687 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" event={"ID":"5a956600-95ba-4723-a8f3-0c10bdd4cbe7","Type":"ContainerDied","Data":"5c758c47630648de7f5e7f02feba4149adeeabc04472e0e61246e81e441bd55a"} Apr 19 12:21:30.783711 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:30.783676 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" event={"ID":"5a956600-95ba-4723-a8f3-0c10bdd4cbe7","Type":"ContainerStarted","Data":"cee0c789397ebfdcc7b4cf816b63fab10e95e786061f008779059cc52459a77b"} Apr 19 12:21:30.784165 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:30.783911 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:21:30.803602 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:30.803556 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" podStartSLOduration=7.646909009 podStartE2EDuration="7.803543006s" podCreationTimestamp="2026-04-19 12:21:23 +0000 UTC" firstStartedPulling="2026-04-19 12:21:29.77843151 +0000 UTC m=+708.497255644" lastFinishedPulling="2026-04-19 12:21:29.935065496 +0000 UTC m=+708.653889641" observedRunningTime="2026-04-19 12:21:30.79980923 +0000 UTC m=+709.518633400" watchObservedRunningTime="2026-04-19 12:21:30.803543006 +0000 UTC m=+709.522367160" Apr 19 12:21:35.774019 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:35.773981 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2" Apr 19 12:21:41.801444 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:21:41.801412 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp" Apr 19 12:23:08.758088 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:08.758001 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-54dc75dff7-rxbn6"] Apr 19 12:23:08.758575 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:08.758241 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" podUID="f2fae46d-7d31-40b0-baf3-5107a28bb62a" containerName="manager" containerID="cri-o://0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5" gracePeriod=10 Apr 19 12:23:09.004491 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.004469 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" Apr 19 12:23:09.106950 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.106871 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l46nk\" (UniqueName: \"kubernetes.io/projected/f2fae46d-7d31-40b0-baf3-5107a28bb62a-kube-api-access-l46nk\") pod \"f2fae46d-7d31-40b0-baf3-5107a28bb62a\" (UID: \"f2fae46d-7d31-40b0-baf3-5107a28bb62a\") " Apr 19 12:23:09.108883 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.108851 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2fae46d-7d31-40b0-baf3-5107a28bb62a-kube-api-access-l46nk" (OuterVolumeSpecName: "kube-api-access-l46nk") pod "f2fae46d-7d31-40b0-baf3-5107a28bb62a" (UID: "f2fae46d-7d31-40b0-baf3-5107a28bb62a"). InnerVolumeSpecName "kube-api-access-l46nk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:23:09.189751 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.189721 2568 generic.go:358] "Generic (PLEG): container finished" podID="f2fae46d-7d31-40b0-baf3-5107a28bb62a" containerID="0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5" exitCode=0 Apr 19 12:23:09.189908 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.189787 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" Apr 19 12:23:09.189908 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.189809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" event={"ID":"f2fae46d-7d31-40b0-baf3-5107a28bb62a","Type":"ContainerDied","Data":"0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5"} Apr 19 12:23:09.189908 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.189847 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dc75dff7-rxbn6" event={"ID":"f2fae46d-7d31-40b0-baf3-5107a28bb62a","Type":"ContainerDied","Data":"df0068fba45e50aae2f90270aa39a512c6cf4613145077b435024dd4546ae698"} Apr 19 12:23:09.189908 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.189866 2568 scope.go:117] "RemoveContainer" containerID="0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5" Apr 19 12:23:09.199164 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.199147 2568 scope.go:117] "RemoveContainer" containerID="0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5" Apr 19 12:23:09.199399 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:23:09.199379 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5\": container with ID starting with 0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5 not found: ID does not exist" containerID="0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5" Apr 19 12:23:09.199463 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.199412 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5"} err="failed to get container status \"0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5\": rpc error: 
code = NotFound desc = could not find container \"0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5\": container with ID starting with 0ac2d711bab161cb2b04638bb20d4f82d411723defede664f42447534f0fb7c5 not found: ID does not exist" Apr 19 12:23:09.208189 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.208166 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l46nk\" (UniqueName: \"kubernetes.io/projected/f2fae46d-7d31-40b0-baf3-5107a28bb62a-kube-api-access-l46nk\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:23:09.210892 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.210873 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-54dc75dff7-rxbn6"] Apr 19 12:23:09.222823 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.222797 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-54dc75dff7-rxbn6"] Apr 19 12:23:09.870195 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:23:09.870158 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2fae46d-7d31-40b0-baf3-5107a28bb62a" path="/var/lib/kubelet/pods/f2fae46d-7d31-40b0-baf3-5107a28bb62a/volumes" Apr 19 12:33:18.105438 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.105401 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k"] Apr 19 12:33:18.107963 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.105647 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" podUID="41ae5e67-fd4a-40b2-b192-c1021c3b37bb" containerName="manager" containerID="cri-o://7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7" gracePeriod=10 Apr 19 12:33:18.655486 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.655465 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:33:18.703749 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.703722 2568 generic.go:358] "Generic (PLEG): container finished" podID="41ae5e67-fd4a-40b2-b192-c1021c3b37bb" containerID="7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7" exitCode=0 Apr 19 12:33:18.703915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.703803 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" Apr 19 12:33:18.703915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.703807 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" event={"ID":"41ae5e67-fd4a-40b2-b192-c1021c3b37bb","Type":"ContainerDied","Data":"7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7"} Apr 19 12:33:18.703915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.703852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k" event={"ID":"41ae5e67-fd4a-40b2-b192-c1021c3b37bb","Type":"ContainerDied","Data":"eeaa951fc6d1165fc2886b6148b07d71743f1c5445cf88fa213ba7f67ef8e6d9"} Apr 19 12:33:18.703915 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.703873 2568 scope.go:117] "RemoveContainer" containerID="7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7" Apr 19 12:33:18.715060 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.715044 2568 scope.go:117] "RemoveContainer" containerID="7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7" Apr 19 12:33:18.715326 ip-10-0-140-225 kubenswrapper[2568]: E0419 12:33:18.715309 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7\": container with ID starting with 7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7 not found: ID does not exist" containerID="7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7" Apr 19 12:33:18.715378 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.715334 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7"} err="failed to get container status \"7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7\": rpc error: code = NotFound desc = could not find container \"7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7\": container with ID starting with 7b6f4da41ce917960bec2765e72556f16576ccd1d1305466ebc802c5869f95b7 not found: ID does not exist" Apr 19 12:33:18.768342 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.768313 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-extensions-socket-volume\") pod \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " Apr 19 12:33:18.768470 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.768437 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klskv\" (UniqueName: \"kubernetes.io/projected/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-kube-api-access-klskv\") pod \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\" (UID: \"41ae5e67-fd4a-40b2-b192-c1021c3b37bb\") " Apr 19 12:33:18.768665 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.768619 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod 
"41ae5e67-fd4a-40b2-b192-c1021c3b37bb" (UID: "41ae5e67-fd4a-40b2-b192-c1021c3b37bb"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 19 12:33:18.770332 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.770311 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-kube-api-access-klskv" (OuterVolumeSpecName: "kube-api-access-klskv") pod "41ae5e67-fd4a-40b2-b192-c1021c3b37bb" (UID: "41ae5e67-fd4a-40b2-b192-c1021c3b37bb"). InnerVolumeSpecName "kube-api-access-klskv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:33:18.869713 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.869683 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-klskv\" (UniqueName: \"kubernetes.io/projected/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-kube-api-access-klskv\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:33:18.869713 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:18.869709 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41ae5e67-fd4a-40b2-b192-c1021c3b37bb-extensions-socket-volume\") on node \"ip-10-0-140-225.ec2.internal\" DevicePath \"\"" Apr 19 12:33:19.026604 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:19.026573 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k"] Apr 19 12:33:19.031012 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:19.030991 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zh97k"] Apr 19 12:33:19.869367 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:33:19.869338 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ae5e67-fd4a-40b2-b192-c1021c3b37bb" 
path="/var/lib/kubelet/pods/41ae5e67-fd4a-40b2-b192-c1021c3b37bb/volumes" Apr 19 12:34:24.137266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.137230 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc"] Apr 19 12:34:24.137743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.137638 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2fae46d-7d31-40b0-baf3-5107a28bb62a" containerName="manager" Apr 19 12:34:24.137743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.137651 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2fae46d-7d31-40b0-baf3-5107a28bb62a" containerName="manager" Apr 19 12:34:24.137743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.137666 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41ae5e67-fd4a-40b2-b192-c1021c3b37bb" containerName="manager" Apr 19 12:34:24.137743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.137672 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ae5e67-fd4a-40b2-b192-c1021c3b37bb" containerName="manager" Apr 19 12:34:24.137743 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.137744 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="41ae5e67-fd4a-40b2-b192-c1021c3b37bb" containerName="manager" Apr 19 12:34:24.137905 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.137755 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2fae46d-7d31-40b0-baf3-5107a28bb62a" containerName="manager" Apr 19 12:34:24.140736 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.140719 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.143696 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.143648 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-g6tzz\"" Apr 19 12:34:24.143696 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.143688 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 19 12:34:24.143887 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.143709 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 19 12:34:24.152082 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.152061 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc"] Apr 19 12:34:24.246105 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.246073 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stxcq\" (UniqueName: \"kubernetes.io/projected/48631414-8290-4c5b-8bb1-1796dcc9e506-kube-api-access-stxcq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-58lsc\" (UID: \"48631414-8290-4c5b-8bb1-1796dcc9e506\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.246264 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.246110 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/48631414-8290-4c5b-8bb1-1796dcc9e506-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-58lsc\" (UID: \"48631414-8290-4c5b-8bb1-1796dcc9e506\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.346854 
ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.346817 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stxcq\" (UniqueName: \"kubernetes.io/projected/48631414-8290-4c5b-8bb1-1796dcc9e506-kube-api-access-stxcq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-58lsc\" (UID: \"48631414-8290-4c5b-8bb1-1796dcc9e506\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.347024 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.346861 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/48631414-8290-4c5b-8bb1-1796dcc9e506-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-58lsc\" (UID: \"48631414-8290-4c5b-8bb1-1796dcc9e506\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.347272 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.347250 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/48631414-8290-4c5b-8bb1-1796dcc9e506-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-58lsc\" (UID: \"48631414-8290-4c5b-8bb1-1796dcc9e506\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.354675 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.354654 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stxcq\" (UniqueName: \"kubernetes.io/projected/48631414-8290-4c5b-8bb1-1796dcc9e506-kube-api-access-stxcq\") pod \"kuadrant-operator-controller-manager-55c7f4c975-58lsc\" (UID: \"48631414-8290-4c5b-8bb1-1796dcc9e506\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.451756 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.451726 2568 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.595079 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.595047 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc"] Apr 19 12:34:24.595883 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:34:24.595848 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48631414_8290_4c5b_8bb1_1796dcc9e506.slice/crio-1c346a753f4b6b9f9885f83b49ae714a84abea6fc47382851030d686df2b8242 WatchSource:0}: Error finding container 1c346a753f4b6b9f9885f83b49ae714a84abea6fc47382851030d686df2b8242: Status 404 returned error can't find the container with id 1c346a753f4b6b9f9885f83b49ae714a84abea6fc47382851030d686df2b8242 Apr 19 12:34:24.598243 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.598225 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:34:24.967069 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.967038 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" event={"ID":"48631414-8290-4c5b-8bb1-1796dcc9e506","Type":"ContainerStarted","Data":"1370065fa28d3a6321e4cba6d3f7a73c653f3044123e35867751903fd321710d"} Apr 19 12:34:24.967069 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.967074 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" event={"ID":"48631414-8290-4c5b-8bb1-1796dcc9e506","Type":"ContainerStarted","Data":"1c346a753f4b6b9f9885f83b49ae714a84abea6fc47382851030d686df2b8242"} Apr 19 12:34:24.967410 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.967103 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:34:24.989695 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:24.989655 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" podStartSLOduration=0.989639494 podStartE2EDuration="989.639494ms" podCreationTimestamp="2026-04-19 12:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:34:24.986868084 +0000 UTC m=+1483.705692238" watchObservedRunningTime="2026-04-19 12:34:24.989639494 +0000 UTC m=+1483.708463639" Apr 19 12:34:35.977436 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:34:35.977406 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-58lsc" Apr 19 12:44:07.789497 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:07.789410 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-mxx94_33648a89-a48d-4908-9aa7-b933d1e02e8c/manager/0.log" Apr 19 12:44:07.895547 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:07.895517 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-d69f7d5bf-c5mss_46397433-f672-4827-bf15-5b131629f949/maas-api/0.log" Apr 19 12:44:08.339803 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:08.339774 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-676bcb86f4-hhwxn_bf8bd34d-5a0d-40e3-b837-24ba4ae7e215/manager/0.log" Apr 19 12:44:08.559803 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:08.559759 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-8x6qm_5577bc4c-fe2e-4f1f-857f-c5fa86753ea5/postgres/0.log" Apr 19 12:44:09.263990 ip-10-0-140-225 kubenswrapper[2568]: I0419 
12:44:09.263964 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9_02298eb0-c2a6-414b-ac9a-3880f185b1f9/util/0.log" Apr 19 12:44:09.269592 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.269567 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9_02298eb0-c2a6-414b-ac9a-3880f185b1f9/pull/0.log" Apr 19 12:44:09.274769 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.274746 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9_02298eb0-c2a6-414b-ac9a-3880f185b1f9/extract/0.log" Apr 19 12:44:09.377109 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.377079 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf_6a1a0fbb-ddda-40c3-9d18-247790359cfa/util/0.log" Apr 19 12:44:09.382245 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.382221 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf_6a1a0fbb-ddda-40c3-9d18-247790359cfa/pull/0.log" Apr 19 12:44:09.387615 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.387596 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf_6a1a0fbb-ddda-40c3-9d18-247790359cfa/extract/0.log" Apr 19 12:44:09.492107 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.492082 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb_1974cf29-120a-4b71-9c55-8896b6d353a9/pull/0.log" Apr 19 12:44:09.497921 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.497901 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb_1974cf29-120a-4b71-9c55-8896b6d353a9/extract/0.log" Apr 19 12:44:09.503008 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.502987 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb_1974cf29-120a-4b71-9c55-8896b6d353a9/util/0.log" Apr 19 12:44:09.606214 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.606156 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7_d1a4d9ef-7961-4947-a625-743fc70d81ca/util/0.log" Apr 19 12:44:09.611755 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.611736 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7_d1a4d9ef-7961-4947-a625-743fc70d81ca/pull/0.log" Apr 19 12:44:09.616891 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:09.616875 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7_d1a4d9ef-7961-4947-a625-743fc70d81ca/extract/0.log" Apr 19 12:44:10.257091 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:10.257061 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-58lsc_48631414-8290-4c5b-8bb1-1796dcc9e506/manager/0.log" Apr 19 12:44:10.874708 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:10.874678 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8tpgs_93664af5-2722-48d4-b948-7a32e4d3c11e/discovery/0.log" Apr 19 12:44:10.974092 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:10.974054 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-58497579d8-6g4hh_e633d622-52e6-40a4-aef0-84f7a013542b/kube-auth-proxy/0.log" Apr 19 12:44:11.287805 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.287768 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-57d6b784d-fjqwn_62ce91ba-1d6b-4aae-9fc2-6ef69f90a963/router/0.log" Apr 19 12:44:11.599304 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.599209 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp_5a956600-95ba-4723-a8f3-0c10bdd4cbe7/storage-initializer/0.log" Apr 19 12:44:11.605453 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.605421 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-7f7fp_5a956600-95ba-4723-a8f3-0c10bdd4cbe7/main/0.log" Apr 19 12:44:11.709657 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.709616 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5_0df244f0-d09b-4e97-9d48-46e4ba43877f/storage-initializer/0.log" Apr 19 12:44:11.716259 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.716228 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-8q4j5_0df244f0-d09b-4e97-9d48-46e4ba43877f/main/0.log" Apr 19 12:44:11.817326 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.817278 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-wmwqn_582f660a-326a-427a-9102-b23ffada154c/storage-initializer/0.log" Apr 19 12:44:11.823123 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.823104 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-wmwqn_582f660a-326a-427a-9102-b23ffada154c/main/0.log" Apr 19 12:44:11.926700 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.926614 2568 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv_84a7220d-f9d5-4840-8c8f-13fedc64c3db/storage-initializer/0.log" Apr 19 12:44:11.935112 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:11.935089 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc9qsdv_84a7220d-f9d5-4840-8c8f-13fedc64c3db/main/0.log" Apr 19 12:44:12.041914 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:12.041882 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6_62e1fbc8-9352-4ede-847f-5f3523ddd9d7/storage-initializer/0.log" Apr 19 12:44:12.047978 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:12.047944 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-lvpq6_62e1fbc8-9352-4ede-847f-5f3523ddd9d7/main/0.log" Apr 19 12:44:12.152783 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:12.152757 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2_2a7fb481-5ff3-45ce-9dff-4f91f0425b77/storage-initializer/0.log" Apr 19 12:44:12.158832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:12.158812 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-xpwz2_2a7fb481-5ff3-45ce-9dff-4f91f0425b77/main/0.log" Apr 19 12:44:18.613242 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:18.613213 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jlktl_64f68a15-86a3-4526-a8c3-2d66d94b763f/global-pull-secret-syncer/0.log" Apr 19 12:44:18.757779 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:18.757751 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-8gk7l_fefc3e1e-c62c-487a-a8d5-2c56a47bb505/konnectivity-agent/0.log" Apr 19 12:44:18.850824 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:18.850786 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-225.ec2.internal_e49faeb3a3d52b4f2890ad643fe20f27/haproxy/0.log" Apr 19 12:44:22.774017 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.773987 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9_02298eb0-c2a6-414b-ac9a-3880f185b1f9/extract/0.log" Apr 19 12:44:22.791760 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.791733 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9_02298eb0-c2a6-414b-ac9a-3880f185b1f9/util/0.log" Apr 19 12:44:22.812773 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.812749 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759hwjp9_02298eb0-c2a6-414b-ac9a-3880f185b1f9/pull/0.log" Apr 19 12:44:22.838995 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.838969 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf_6a1a0fbb-ddda-40c3-9d18-247790359cfa/extract/0.log" Apr 19 12:44:22.859572 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.859553 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf_6a1a0fbb-ddda-40c3-9d18-247790359cfa/util/0.log" Apr 19 12:44:22.881396 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.881375 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0mgttf_6a1a0fbb-ddda-40c3-9d18-247790359cfa/pull/0.log" Apr 19 12:44:22.908369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.908347 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb_1974cf29-120a-4b71-9c55-8896b6d353a9/extract/0.log" Apr 19 12:44:22.927511 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.927478 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb_1974cf29-120a-4b71-9c55-8896b6d353a9/util/0.log" Apr 19 12:44:22.947194 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.947175 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73smbgb_1974cf29-120a-4b71-9c55-8896b6d353a9/pull/0.log" Apr 19 12:44:22.972346 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.972330 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7_d1a4d9ef-7961-4947-a625-743fc70d81ca/extract/0.log" Apr 19 12:44:22.992387 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:22.992370 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7_d1a4d9ef-7961-4947-a625-743fc70d81ca/util/0.log" Apr 19 12:44:23.013488 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:23.013466 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef14d9d7_d1a4d9ef-7961-4947-a625-743fc70d81ca/pull/0.log" Apr 19 12:44:23.475615 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:23.475583 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-58lsc_48631414-8290-4c5b-8bb1-1796dcc9e506/manager/0.log" Apr 19 12:44:24.943769 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:24.943742 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31684d9c-023e-4de7-a35c-c02cfb7c0b4f/alertmanager/0.log" Apr 19 12:44:24.966636 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:24.966599 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31684d9c-023e-4de7-a35c-c02cfb7c0b4f/config-reloader/0.log" Apr 19 12:44:24.988325 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:24.988304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31684d9c-023e-4de7-a35c-c02cfb7c0b4f/kube-rbac-proxy-web/0.log" Apr 19 12:44:25.011140 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.011115 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31684d9c-023e-4de7-a35c-c02cfb7c0b4f/kube-rbac-proxy/0.log" Apr 19 12:44:25.031145 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.031116 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31684d9c-023e-4de7-a35c-c02cfb7c0b4f/kube-rbac-proxy-metric/0.log" Apr 19 12:44:25.050820 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.050799 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31684d9c-023e-4de7-a35c-c02cfb7c0b4f/prom-label-proxy/0.log" Apr 19 12:44:25.073831 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.073808 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_31684d9c-023e-4de7-a35c-c02cfb7c0b4f/init-config-reloader/0.log" Apr 19 12:44:25.315046 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.315018 2568 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8gp6_bea51828-5053-4026-a29e-73d8cc734dcd/node-exporter/0.log" Apr 19 12:44:25.333991 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.333970 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8gp6_bea51828-5053-4026-a29e-73d8cc734dcd/kube-rbac-proxy/0.log" Apr 19 12:44:25.351926 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.351909 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8gp6_bea51828-5053-4026-a29e-73d8cc734dcd/init-textfile/0.log" Apr 19 12:44:25.441437 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.441409 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-tq6hj_5725e8df-66f1-4e22-bfed-e6467ea2c2a6/kube-rbac-proxy-main/0.log" Apr 19 12:44:25.459548 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.459524 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-tq6hj_5725e8df-66f1-4e22-bfed-e6467ea2c2a6/kube-rbac-proxy-self/0.log" Apr 19 12:44:25.477903 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.477885 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-tq6hj_5725e8df-66f1-4e22-bfed-e6467ea2c2a6/openshift-state-metrics/0.log" Apr 19 12:44:25.651266 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.651192 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kckkr_d391a91d-cb33-4e29-a391-e8fd3f91c810/prometheus-operator/0.log" Apr 19 12:44:25.669999 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.669969 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-kckkr_d391a91d-cb33-4e29-a391-e8fd3f91c810/kube-rbac-proxy/0.log" Apr 19 
12:44:25.691904 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.691885 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9rcpd_5349ed6d-296f-49e9-8001-1cb71a7a2a71/prometheus-operator-admission-webhook/0.log" Apr 19 12:44:25.719951 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.719930 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-557f7b95bf-wpn6m_fb0976d2-6114-4151-ad73-08b849a869aa/telemeter-client/0.log" Apr 19 12:44:25.738444 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.738425 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-557f7b95bf-wpn6m_fb0976d2-6114-4151-ad73-08b849a869aa/reload/0.log" Apr 19 12:44:25.757095 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:25.757077 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-557f7b95bf-wpn6m_fb0976d2-6114-4151-ad73-08b849a869aa/kube-rbac-proxy/0.log" Apr 19 12:44:27.196663 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.196614 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"] Apr 19 12:44:27.200369 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.200352 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" Apr 19 12:44:27.203074 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.203043 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kxrl8\"/\"default-dockercfg-qptrb\"" Apr 19 12:44:27.203216 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.203074 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxrl8\"/\"kube-root-ca.crt\"" Apr 19 12:44:27.203216 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.203043 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxrl8\"/\"openshift-service-ca.crt\"" Apr 19 12:44:27.205330 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.205308 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"] Apr 19 12:44:27.273155 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.273116 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbsdt\" (UniqueName: \"kubernetes.io/projected/3ffb7719-1812-42bd-b622-c1914a8e6469-kube-api-access-mbsdt\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" Apr 19 12:44:27.273155 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.273159 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-proc\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" Apr 19 12:44:27.273364 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.273181 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-sys\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" Apr 19 12:44:27.273364 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.273282 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-lib-modules\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" Apr 19 12:44:27.273364 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.273316 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-podres\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" Apr 19 12:44:27.374175 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbsdt\" (UniqueName: \"kubernetes.io/projected/3ffb7719-1812-42bd-b622-c1914a8e6469-kube-api-access-mbsdt\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" Apr 19 12:44:27.374175 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-proc\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " 
pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.374402 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-sys\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.374402 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374234 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-lib-modules\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.374402 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374257 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-podres\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.374402 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374279 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-proc\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.374402 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374339 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-sys\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.374569 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374414 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-podres\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.374569 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.374414 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ffb7719-1812-42bd-b622-c1914a8e6469-lib-modules\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.381345 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.381324 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbsdt\" (UniqueName: \"kubernetes.io/projected/3ffb7719-1812-42bd-b622-c1914a8e6469-kube-api-access-mbsdt\") pod \"perf-node-gather-daemonset-c4bdc\" (UID: \"3ffb7719-1812-42bd-b622-c1914a8e6469\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.512227 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.512143 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:27.640323 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.640300 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"]
Apr 19 12:44:27.642240 ip-10-0-140-225 kubenswrapper[2568]: W0419 12:44:27.642201 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3ffb7719_1812_42bd_b622_c1914a8e6469.slice/crio-79fda64934cb123d6942547d504eb890d0334bde9f152f3ad2e3b27c2653a45a WatchSource:0}: Error finding container 79fda64934cb123d6942547d504eb890d0334bde9f152f3ad2e3b27c2653a45a: Status 404 returned error can't find the container with id 79fda64934cb123d6942547d504eb890d0334bde9f152f3ad2e3b27c2653a45a
Apr 19 12:44:27.644182 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.644163 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:44:27.789537 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:27.789485 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64db9444b9-rlpzp_21388144-4fec-41b8-bea8-923e7a8c17ab/console/0.log"
Apr 19 12:44:28.235115 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:28.235083 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-pqspr_14272c98-366d-4c1a-a78d-04018f274961/volume-data-source-validator/0.log"
Apr 19 12:44:28.461231 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:28.461201 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" event={"ID":"3ffb7719-1812-42bd-b622-c1914a8e6469","Type":"ContainerStarted","Data":"e669f0109e39778a989053d61566f16bc262503d12036b6eb1f35a367270a5a2"}
Apr 19 12:44:28.461231 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:28.461234 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" event={"ID":"3ffb7719-1812-42bd-b622-c1914a8e6469","Type":"ContainerStarted","Data":"79fda64934cb123d6942547d504eb890d0334bde9f152f3ad2e3b27c2653a45a"}
Apr 19 12:44:28.461449 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:28.461296 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:28.476548 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:28.476503 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc" podStartSLOduration=1.476489618 podStartE2EDuration="1.476489618s" podCreationTimestamp="2026-04-19 12:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:44:28.474487501 +0000 UTC m=+2087.193311657" watchObservedRunningTime="2026-04-19 12:44:28.476489618 +0000 UTC m=+2087.195313811"
Apr 19 12:44:29.100669 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:29.100638 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kzhlq_9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2/dns/0.log"
Apr 19 12:44:29.118363 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:29.118341 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kzhlq_9b5e2955-23ce-47dd-b5ce-f68dd7cd9be2/kube-rbac-proxy/0.log"
Apr 19 12:44:29.157658 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:29.157615 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-h8wxf_9920f4d8-e6b0-4993-baa5-e254915bebae/dns-node-resolver/0.log"
Apr 19 12:44:29.642830 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:29.642802 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-t4c4r_4e45a4de-472d-4b7e-addc-01dfec69c9d8/node-ca/0.log"
Apr 19 12:44:30.487850 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:30.487816 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8tpgs_93664af5-2722-48d4-b948-7a32e4d3c11e/discovery/0.log"
Apr 19 12:44:30.505998 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:30.505976 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-58497579d8-6g4hh_e633d622-52e6-40a4-aef0-84f7a013542b/kube-auth-proxy/0.log"
Apr 19 12:44:30.602568 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:30.602543 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-57d6b784d-fjqwn_62ce91ba-1d6b-4aae-9fc2-6ef69f90a963/router/0.log"
Apr 19 12:44:31.087756 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:31.087726 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9xxkb_15abc5b3-a4e0-41a2-b57d-ee187b37cd52/serve-healthcheck-canary/0.log"
Apr 19 12:44:31.674168 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:31.674139 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tth9c_495fbf4b-c9eb-42e3-8537-ae405b10f613/kube-rbac-proxy/0.log"
Apr 19 12:44:31.693768 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:31.693740 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tth9c_495fbf4b-c9eb-42e3-8537-ae405b10f613/exporter/0.log"
Apr 19 12:44:31.713504 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:31.713479 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tth9c_495fbf4b-c9eb-42e3-8537-ae405b10f613/extractor/0.log"
Apr 19 12:44:33.486470 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:33.486442 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-mxx94_33648a89-a48d-4908-9aa7-b933d1e02e8c/manager/0.log"
Apr 19 12:44:33.514832 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:33.514808 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-d69f7d5bf-c5mss_46397433-f672-4827-bf15-5b131629f949/maas-api/0.log"
Apr 19 12:44:33.714355 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:33.714329 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-676bcb86f4-hhwxn_bf8bd34d-5a0d-40e3-b837-24ba4ae7e215/manager/0.log"
Apr 19 12:44:33.762126 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:33.762053 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-8x6qm_5577bc4c-fe2e-4f1f-857f-c5fa86753ea5/postgres/0.log"
Apr 19 12:44:34.475734 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:34.475709 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-c4bdc"
Apr 19 12:44:34.857380 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:34.857310 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-pzdhj_23b1fea5-f79c-4b92-af32-5017b5e92514/openshift-lws-operator/0.log"
Apr 19 12:44:40.809738 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:40.809709 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rqgk2_688acc9f-4a93-448c-a106-915356989bff/kube-multus-additional-cni-plugins/0.log"
Apr 19 12:44:40.832041 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:40.832010 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rqgk2_688acc9f-4a93-448c-a106-915356989bff/egress-router-binary-copy/0.log"
Apr 19 12:44:40.851640 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:40.851609 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rqgk2_688acc9f-4a93-448c-a106-915356989bff/cni-plugins/0.log"
Apr 19 12:44:40.870660 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:40.870645 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rqgk2_688acc9f-4a93-448c-a106-915356989bff/bond-cni-plugin/0.log"
Apr 19 12:44:40.889709 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:40.889692 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rqgk2_688acc9f-4a93-448c-a106-915356989bff/routeoverride-cni/0.log"
Apr 19 12:44:40.907469 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:40.907441 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rqgk2_688acc9f-4a93-448c-a106-915356989bff/whereabouts-cni-bincopy/0.log"
Apr 19 12:44:40.926448 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:40.926424 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rqgk2_688acc9f-4a93-448c-a106-915356989bff/whereabouts-cni/0.log"
Apr 19 12:44:41.136450 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:41.136383 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jxzs6_f35f6e45-1295-40e6-a620-e3a0f9a2dd05/kube-multus/0.log"
Apr 19 12:44:41.196882 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:41.196857 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7vkmz_29686a24-b6da-4655-8af2-679ab3a6bbbf/network-metrics-daemon/0.log"
Apr 19 12:44:41.213737 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:41.213702 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7vkmz_29686a24-b6da-4655-8af2-679ab3a6bbbf/kube-rbac-proxy/0.log"
Apr 19 12:44:42.288800 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.288771 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/ovn-controller/0.log"
Apr 19 12:44:42.316640 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.316594 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/ovn-acl-logging/0.log"
Apr 19 12:44:42.336372 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.336350 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/kube-rbac-proxy-node/0.log"
Apr 19 12:44:42.355927 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.355901 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/kube-rbac-proxy-ovn-metrics/0.log"
Apr 19 12:44:42.372190 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.372167 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/northd/0.log"
Apr 19 12:44:42.390479 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.390462 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/nbdb/0.log"
Apr 19 12:44:42.409190 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.409170 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/sbdb/0.log"
Apr 19 12:44:42.513411 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:42.513385 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7t4b4_674429c5-1701-4b79-a719-7de71b17fc9c/ovnkube-controller/0.log"
Apr 19 12:44:43.819655 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:43.819594 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-8dx5p_46d7fd4e-5950-4a4e-b4bb-712e0db4a633/check-endpoints/0.log"
Apr 19 12:44:43.885845 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:43.885809 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xmmjm_571bc17e-6675-462f-9093-2c3531edf595/network-check-target-container/0.log"
Apr 19 12:44:44.889856 ip-10-0-140-225 kubenswrapper[2568]: I0419 12:44:44.889831 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s2mg7_92f813a8-85b5-4872-b0d8-1b7ab33b4e86/iptables-alerter/0.log"