Apr 16 16:23:16.040425 ip-10-0-138-125 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:23:16.510008 ip-10-0-138-125 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:23:16.510008 ip-10-0-138-125 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:23:16.510008 ip-10-0-138-125 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:23:16.510008 ip-10-0-138-125 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:23:16.510008 ip-10-0-138-125 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:23:16.512097 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.511976 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:23:16.517382 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517362 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:23:16.517382 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517381 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:23:16.517382 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517386 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517389 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517392 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517396 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517400 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517403 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517405 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517408 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517412 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517415 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517418 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517421 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517423 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517426 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517429 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517432 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517435 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517438 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517440 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:23:16.517483 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517445 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517449 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517452 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517454 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517457 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517460 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517463 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517465 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517469 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517471 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517474 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517476 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517479 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517482 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517484 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517487 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517491 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517495 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517498 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517500 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:23:16.517952 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517503 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517506 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517508 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517511 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517513 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517516 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517518 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517521 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517524 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517526 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517529 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517531 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517534 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517536 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517540 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517543 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517546 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517548 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517551 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517553 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:23:16.518449 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517556 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517559 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517561 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517565 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517569 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517576 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517580 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517583 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517585 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517588 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517590 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517593 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517595 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517598 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517601 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517603 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517605 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517608 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517610 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:23:16.518939 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517613 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517615 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517618 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517621 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517624 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.517627 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518088 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518094 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518098 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518100 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518104 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518107 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518110 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518127 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518130 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518133 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518136 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518138 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518147 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:23:16.519416 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518150 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518152 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518155 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518157 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518160 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518163 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518165 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518168 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518171 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518174 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518176 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518179 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518181 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518184 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518187 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518189 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518192 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518195 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518199 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518203 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:23:16.519897 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518206 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518209 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518212 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518214 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518217 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518220 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518222 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518225 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518228 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518230 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518234 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518236 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518242 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518245 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518248 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518250 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518253 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518255 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518258 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518260 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:23:16.520417 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518263 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518265 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518268 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518270 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518273 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518275 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518278 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518280 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518283 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518285 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518288 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518291 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518294 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518297 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518300 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518302 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518305 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518309 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518313 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:23:16.521000 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518317 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518319 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518322 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518325 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518327 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518330 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518333 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518335 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518337 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518340 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518342 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518345 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518347 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.518350 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519146 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519156 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519167 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519173 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519177 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519181 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519186 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:23:16.521498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519191 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519194 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519197 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519200 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519205 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519208 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519212 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519215 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519218 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519221 2571 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519224 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519227 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519232 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519235 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519238 2571 flags.go:64] FLAG: --config-dir=""
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519241 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519245 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519249 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519253 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519256 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519260 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519263 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519266 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519269 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519272 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:23:16.522030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519276 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519281 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519284 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519287 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519290 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519293 2571 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519297 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519301 2571 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519305 2571 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519308 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519311 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519314 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519318 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519321 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519324 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519328 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519331 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519334 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519337 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519340 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519343 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 
16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519346 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519349 2571 flags.go:64] FLAG: --feature-gates="" Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519353 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519356 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:23:16.522650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519360 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519364 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519367 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519370 2571 flags.go:64] FLAG: --help="false" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519373 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519377 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519379 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519382 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519386 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519389 2571 flags.go:64] 
FLAG: --image-gc-high-threshold="85" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519392 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519395 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519398 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519401 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519405 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519408 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519411 2571 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519414 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519417 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519421 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519424 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519427 2571 flags.go:64] FLAG: --lock-file="" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519430 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:23:16.523273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519433 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:23:16.523273 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:23:16.519437 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519443 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519446 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519449 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519452 2571 flags.go:64] FLAG: --logging-format="text" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519455 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519458 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519461 2571 flags.go:64] FLAG: --manifest-url="" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519464 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519469 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519473 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519477 2571 flags.go:64] FLAG: --max-pods="110" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519480 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519483 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519487 2571 flags.go:64] FLAG: 
--memory-manager-policy="None" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519490 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519494 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519497 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519500 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519507 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519510 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519513 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519517 2571 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:23:16.523853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519520 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519527 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519530 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519533 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519536 2571 flags.go:64] FLAG: --port="10250" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519539 2571 
flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519542 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e9719d2a3ac799f6" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519546 2571 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519549 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519552 2571 flags.go:64] FLAG: --register-node="true" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519555 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519558 2571 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519562 2571 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519564 2571 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519567 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519570 2571 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519574 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519577 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519580 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519583 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:23:16.519587 2571 flags.go:64] FLAG: --runonce="false" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519590 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519593 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519597 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519600 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519602 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:23:16.524443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519606 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519609 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519612 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519615 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519618 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519621 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519624 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519627 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519630 2571 flags.go:64] FLAG: 
--system-cgroups="" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519633 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519639 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519642 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519644 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519649 2571 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519652 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519655 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519657 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519661 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519664 2571 flags.go:64] FLAG: --v="2" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519668 2571 flags.go:64] FLAG: --version="false" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519672 2571 flags.go:64] FLAG: --vmodule="" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519677 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.519680 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: W0416 
16:23:16.519783 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:23:16.525150 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519786 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519790 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519793 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519797 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519800 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519803 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519806 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519809 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519812 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519815 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519817 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519820 2571 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAWS Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519822 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519825 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519828 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519831 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519834 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519836 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519839 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:23:16.525763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519842 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519844 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519847 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519849 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519852 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 
16:23:16.519854 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519857 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519859 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519862 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519865 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519867 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519870 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519872 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519874 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519878 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519882 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519885 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519888 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519891 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519894 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:23:16.526259 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519897 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519899 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519902 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519905 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519907 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519910 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519913 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519916 2571 
feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519918 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519921 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519923 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519926 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519928 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519931 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519933 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519936 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519938 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519941 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519944 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:23:16.526763 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519948 2571 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519951 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519954 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519957 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519959 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519962 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519965 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519968 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519971 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519974 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519976 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519979 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519982 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:23:16.527266 
ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519984 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519987 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519989 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519992 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519994 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.519997 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520000 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:23:16.527266 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520002 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520005 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520008 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520010 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520013 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520015 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.520018 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.520582 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.527241 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.527260 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527308 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527314 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527317 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527321 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527324 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527326 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:23:16.527756 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527329 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527332 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527335 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527338 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527342 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527344 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527347 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527350 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527352 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527355 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527358 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527361 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527364 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527366 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527369 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527372 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527375 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527377 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527380 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527382 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:23:16.528221 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527385 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527387 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527390 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527393 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527396 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527400 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527403 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527406 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527409 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527412 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527414 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527417 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527419 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527422 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527425 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527427 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527429 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527432 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527435 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527438 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:23:16.528714 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527440 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527443 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527446 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527450 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527453 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527457 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527459 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527462 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527464 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527468 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527471 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527474 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527476 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527479 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527482 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527487 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527490 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527493 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527496 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:23:16.529242 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527499 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527502 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527505 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527507 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527510 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527513 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527515 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527518 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527520 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527523 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527526 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527529 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527531 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527534 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527537 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527540 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527542 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527545 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527547 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527550 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:23:16.529717 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527552 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.527558 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527663 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527669 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527672 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527675 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527678 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527681 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527684 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527687 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527690 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527692 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527696 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527698 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527701 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:23:16.530241 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527703 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527706 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527709 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527712 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527714 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527717 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527719 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527722 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527725 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527727 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527730 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527733 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527736 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527738 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527741 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527743 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527746 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527748 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527751 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527754 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:23:16.530613 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527757 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527759 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527762 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527765 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527768 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527770 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527773 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527776 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527778 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527781 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527789 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527791 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527794 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527796 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527800 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527804 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527807 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527810 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527812 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527815 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:23:16.531096 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527821 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527823 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527826 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527829 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527831 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527835 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527839 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527842 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527845 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527847 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527850 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527852 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527855 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527858 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527860 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527862 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527865 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527868 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527871 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:23:16.531615 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527874 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527876 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527879 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527881 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527884 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527887 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527889 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527892 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527894 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527897 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527899 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527902 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527904 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:16.527907 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.527912 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:23:16.532080 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.528046 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:23:16.532549 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.530087 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:23:16.532549 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.531468 2571 server.go:1019] "Starting client certificate rotation"
Apr 16 16:23:16.532549 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.531562 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:23:16.532549 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.532364 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:23:16.559724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.559698 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:23:16.565589 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.565565 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:23:16.580899 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.580872 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:23:16.587559 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.587538 2571 log.go:25] "Validated CRI v1 image API"
Apr 16 16:23:16.588827 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.588809 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:23:16.592459 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.592424 2571 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 cc2476b8-6a44-4aea-9c70-870d93577030:/dev/nvme0n1p4 dd37af86-8176-469b-9454-550259270046:/dev/nvme0n1p3]
Apr 16 16:23:16.592562 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.592456 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:23:16.595082 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.595059 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:23:16.597597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.597467 2571 manager.go:217] Machine: {Timestamp:2026-04-16 16:23:16.596267421 +0000 UTC m=+0.431120316 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099674 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a27baa7131a7f8c54bda5b6b407aa SystemUUID:ec2a27ba-a713-1a7f-8c54-bda5b6b407aa BootID:682e260c-8787-4216-8cb8-ebbe8e7624c3 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:47:85:48:fc:ab Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:47:85:48:fc:ab Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:76:84:6f:ea:64 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:23:16.597597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.597583 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:23:16.597775 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.597704 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:23:16.600096 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.600061 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:23:16.600299 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.600098 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-125.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:23:16.600395 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.600319 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:23:16.600395 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.600332 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:23:16.600395 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.600356
2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:23:16.600395 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.600376 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:23:16.602108 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.602094 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:23:16.602256 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.602245 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:23:16.605993 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.605981 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:23:16.606047 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.606001 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:23:16.606808 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.606796 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:23:16.606864 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.606818 2571 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:23:16.606864 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.606833 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 16:23:16.608425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.608411 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:23:16.608501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.608434 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:23:16.613239 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.613202 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:23:16.615415 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:23:16.615397 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:23:16.616730 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.616712 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9xr5s" Apr 16 16:23:16.617196 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617180 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617203 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617209 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617215 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617221 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617227 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617233 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617238 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:23:16.617245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617245 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:23:16.617445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617251 2571 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:23:16.617445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617260 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:23:16.617445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.617270 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:23:16.618215 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.618202 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:23:16.618263 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.618218 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:23:16.618758 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.618726 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:23:16.618832 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.618808 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-125.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:23:16.620488 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.620471 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-125.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:23:16.621887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.621875 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:23:16.621944 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.621912 2571 server.go:1295] "Started kubelet" Apr 16 16:23:16.622013 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.621991 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:23:16.622064 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.622016 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:23:16.622099 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.622079 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:23:16.622685 ip-10-0-138-125 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:23:16.623304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.623289 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:23:16.624464 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.624446 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9xr5s" Apr 16 16:23:16.624559 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.624547 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:23:16.630072 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.630052 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:23:16.630554 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.630539 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:23:16.632507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632367 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:23:16.632507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632389 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:23:16.632507 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.632395 2571 kubelet_node_status.go:515] "Error getting the current 
node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found" Apr 16 16:23:16.632507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632506 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:23:16.632766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632514 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:23:16.632836 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632817 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:23:16.632884 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632841 2571 factory.go:55] Registering systemd factory Apr 16 16:23:16.632884 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632850 2571 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:23:16.632975 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.632901 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:23:16.633168 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.633150 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:23:16.633941 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.633782 2571 factory.go:153] Registering CRI-O factory Apr 16 16:23:16.633941 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.633798 2571 factory.go:223] Registration of the crio container factory successfully Apr 16 16:23:16.633941 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.633796 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:23:16.633941 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.633829 2571 factory.go:103] Registering Raw factory Apr 16 16:23:16.633941 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.633844 2571 manager.go:1196] Started watching for new ooms in manager Apr 16 16:23:16.634616 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.634599 2571 manager.go:319] Starting recovery of all containers Apr 16 16:23:16.638269 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.638246 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-125.ec2.internal\" not found" node="ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.645450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.645435 2571 manager.go:324] Recovery completed Apr 16 16:23:16.649525 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.649513 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:23:16.652023 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.651913 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:23:16.652098 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.652036 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:23:16.652098 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.652047 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:23:16.652557 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.652541 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:23:16.652617 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.652556 2571 cpu_manager.go:223] 
"Reconciling" reconcilePeriod="10s" Apr 16 16:23:16.652617 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.652577 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:23:16.655149 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.655138 2571 policy_none.go:49] "None policy: Start" Apr 16 16:23:16.655184 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.655154 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:23:16.655184 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.655164 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:23:16.700104 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.699800 2571 manager.go:341] "Starting Device Plugin manager" Apr 16 16:23:16.700104 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.699834 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:23:16.700104 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.699845 2571 server.go:85] "Starting device plugin registration server" Apr 16 16:23:16.700293 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.700151 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:23:16.700293 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.700162 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:23:16.700293 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.700283 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:23:16.700432 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.700364 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:23:16.700432 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.700373 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:23:16.700850 ip-10-0-138-125 kubenswrapper[2571]: E0416 
16:23:16.700827 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:23:16.700941 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.700875 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-125.ec2.internal\" not found" Apr 16 16:23:16.748932 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.748878 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:23:16.750169 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.750152 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 16:23:16.750245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.750190 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:23:16.750245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.750212 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 16:23:16.750245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.750218 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:23:16.750354 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.750252 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:23:16.754669 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.754645 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:23:16.800788 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.800716 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:23:16.801699 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.801678 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:23:16.801807 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.801717 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:23:16.801807 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.801731 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:23:16.801807 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.801761 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.813299 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.813279 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.813384 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.813305 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-125.ec2.internal\": node \"ip-10-0-138-125.ec2.internal\" not found" Apr 16 
16:23:16.841032 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.841007 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found" Apr 16 16:23:16.851075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.851047 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal"] Apr 16 16:23:16.851164 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.851154 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:23:16.853065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.853048 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:23:16.853163 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.853076 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:23:16.853163 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.853090 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:23:16.854383 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.854370 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:23:16.854524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.854510 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.854571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.854538 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:23:16.855642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.855624 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:23:16.855729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.855629 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:23:16.855729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.855675 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:23:16.855729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.855690 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:23:16.855729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.855651 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:23:16.855729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.855730 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:23:16.857568 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.857550 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.857620 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.857588 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:23:16.858323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.858307 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:23:16.858400 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.858333 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:23:16.858400 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:16.858344 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:23:16.873738 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.873717 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-125.ec2.internal\" not found" node="ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.877618 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.877600 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-125.ec2.internal\" not found" node="ip-10-0-138-125.ec2.internal" Apr 16 16:23:16.942105 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:16.942076 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found" Apr 16 16:23:17.034939 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.034907 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3daeaf9dae8edeea4bbaed1ffe567636-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-125.ec2.internal\" (UID: \"3daeaf9dae8edeea4bbaed1ffe567636\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.035060 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.034945 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3608a57f147686cc88eda8f9fac573e3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal\" (UID: \"3608a57f147686cc88eda8f9fac573e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.035060 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.034970 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3608a57f147686cc88eda8f9fac573e3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal\" (UID: \"3608a57f147686cc88eda8f9fac573e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.043010 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.042982 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found" Apr 16 16:23:17.136144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.136049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3daeaf9dae8edeea4bbaed1ffe567636-config\") pod \"kube-apiserver-proxy-ip-10-0-138-125.ec2.internal\" (UID: \"3daeaf9dae8edeea4bbaed1ffe567636\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.136144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.136100 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/3608a57f147686cc88eda8f9fac573e3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal\" (UID: \"3608a57f147686cc88eda8f9fac573e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.136144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.136140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3608a57f147686cc88eda8f9fac573e3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal\" (UID: \"3608a57f147686cc88eda8f9fac573e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.136342 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.136170 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3daeaf9dae8edeea4bbaed1ffe567636-config\") pod \"kube-apiserver-proxy-ip-10-0-138-125.ec2.internal\" (UID: \"3daeaf9dae8edeea4bbaed1ffe567636\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.136342 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.136191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3608a57f147686cc88eda8f9fac573e3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal\" (UID: \"3608a57f147686cc88eda8f9fac573e3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" Apr 16 16:23:17.136342 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.136173 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3608a57f147686cc88eda8f9fac573e3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal\" (UID: \"3608a57f147686cc88eda8f9fac573e3\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal"
Apr 16 16:23:17.143125 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.143087 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found"
Apr 16 16:23:17.177301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.177274 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal"
Apr 16 16:23:17.180804 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.180788 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal"
Apr 16 16:23:17.243523 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.243482 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found"
Apr 16 16:23:17.344128 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.344096 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found"
Apr 16 16:23:17.444795 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.444720 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found"
Apr 16 16:23:17.531058 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.531034 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:23:17.531513 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.531197 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:23:17.531513 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.531200 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:23:17.545505 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.545475 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found"
Apr 16 16:23:17.626301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.626261 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:18:16 +0000 UTC" deadline="2028-01-04 17:56:18.436813066 +0000 UTC"
Apr 16 16:23:17.626301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.626290 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15073h33m0.810526173s"
Apr 16 16:23:17.630166 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.630144 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:23:17.643518 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.643494 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:23:17.646337 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.646317 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found"
Apr 16 16:23:17.669535 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.669506 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n5wbg"
Apr 16 16:23:17.675644 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.675623 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n5wbg"
Apr 16 16:23:17.703303 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.703225 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:17.746777 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:17.746746 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-125.ec2.internal\" not found"
Apr 16 16:23:17.763581 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:17.761801 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3daeaf9dae8edeea4bbaed1ffe567636.slice/crio-df3e065e8e847bbbc0c50c601926bf53c93e59eb25996f038d145f3ff9e8718b WatchSource:0}: Error finding container df3e065e8e847bbbc0c50c601926bf53c93e59eb25996f038d145f3ff9e8718b: Status 404 returned error can't find the container with id df3e065e8e847bbbc0c50c601926bf53c93e59eb25996f038d145f3ff9e8718b
Apr 16 16:23:17.764446 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:17.764421 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3608a57f147686cc88eda8f9fac573e3.slice/crio-bf00ba667a5e56622d526aecaac6ef55b7b46fda6db4e291a0647896e7b427f5 WatchSource:0}: Error finding container bf00ba667a5e56622d526aecaac6ef55b7b46fda6db4e291a0647896e7b427f5: Status 404 returned error can't find the container with id bf00ba667a5e56622d526aecaac6ef55b7b46fda6db4e291a0647896e7b427f5
Apr 16 16:23:17.767030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.767017 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:23:17.791724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.791697 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:17.831388 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.831357 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal"
Apr 16 16:23:17.840402 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.840381 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:23:17.841330 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.841316 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal"
Apr 16 16:23:17.852014 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:17.851991 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:23:18.562322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.562298 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:23:18.608467 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.608431 2571 apiserver.go:52] "Watching apiserver"
Apr 16 16:23:18.624177 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.624140 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:23:18.624551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.624531 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-9q8hb","openshift-ovn-kubernetes/ovnkube-node-vm8pb","kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45","openshift-cluster-node-tuning-operator/tuned-r58s4","openshift-image-registry/node-ca-7th94","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal","openshift-multus/multus-additional-cni-plugins-f9km4","openshift-multus/network-metrics-daemon-mtw25","openshift-network-operator/iptables-alerter-2722r","kube-system/konnectivity-agent-p2krm","openshift-dns/node-resolver-9q4lm","openshift-multus/multus-7gghf"]
Apr 16 16:23:18.626733 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.626712 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.628157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.628133 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.629635 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.629384 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.630631 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.630607 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.631808 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.631789 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7th94"
Apr 16 16:23:18.632291 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.632275 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:23:18.632395 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.632280 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 16:23:18.632395 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.632281 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 16:23:18.633099 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.633062 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.633282 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.633218 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mdr2r\""
Apr 16 16:23:18.633539 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.633371 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:23:18.633539 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.633061 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 16:23:18.634156 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.634134 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 16:23:18.634530 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.634431 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:23:18.635388 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.635139 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:23:18.635388 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.635182 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:23:18.635388 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.635285 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8hgpb\""
Apr 16 16:23:18.638200 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.637373 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:18.638200 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.637468 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:18.639217 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.639197 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:18.639316 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.639267 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:18.639376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.639327 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p2krm"
Apr 16 16:23:18.640759 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.640739 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.642155 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.642139 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9q4lm"
Apr 16 16:23:18.644325 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-sys\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.644420 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644340 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovnkube-config\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.644420 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644358 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovnkube-script-lib\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.644420 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4093fb1-792a-4c35-b82a-7713709a1a78-iptables-alerter-script\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.644575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bbnp\" (UniqueName: \"kubernetes.io/projected/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-kube-api-access-4bbnp\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.644575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-os-release\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.644575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-cni-netd\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.644724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644569 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.644724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644623 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:18.644724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-tmp\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.644724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644676 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ea3158-fb50-45bd-a3ff-a9af8b130de9-host\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94"
Apr 16 16:23:18.644724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ea3158-fb50-45bd-a3ff-a9af8b130de9-serviceca\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-etc-selinux\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644773 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-host\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644816 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cnibin\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-node-log\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdnp2\" (UniqueName: \"kubernetes.io/projected/5c076e87-1778-44fa-9253-5a9e0c898f3b-kube-api-access-fdnp2\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644896 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x25k\" (UniqueName: \"kubernetes.io/projected/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-kube-api-access-2x25k\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644942 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtp6s\" (UniqueName: \"kubernetes.io/projected/073e645f-92a9-4855-9057-6a125ec9ebda-kube-api-access-qtp6s\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:18.644972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-var-lib-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.644988 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-modprobe-d\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645003 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qtjd\" (UniqueName: \"kubernetes.io/projected/93ea3158-fb50-45bd-a3ff-a9af8b130de9-kube-api-access-5qtjd\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-env-overrides\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-run\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645082 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-var-lib-kubelet\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645246 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645271 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-ovn\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645311 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-device-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-socket-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.645376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-sys-fs\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfvx5\" (UniqueName: \"kubernetes.io/projected/f4093fb1-792a-4c35-b82a-7713709a1a78-kube-api-access-tfvx5\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645427 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-lib-modules\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645458 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-system-cni-dir\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-etc-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645541 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-registration-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645613 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-kubernetes\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645630 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-kubelet\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645645 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f4093fb1-792a-4c35-b82a-7713709a1a78-host-slash\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645720 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645764 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-systemd\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-systemd\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-tuned\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-slash\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.645951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645853 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysctl-conf\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645920 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-cni-bin\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysconfig\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.645984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrl6g\" (UniqueName: \"kubernetes.io/projected/6e96b3ac-b7d4-44c4-92c5-7706938e5538-kube-api-access-jrl6g\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.646007 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-systemd-units\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.646031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-log-socket\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.646052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-run-netns\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.646696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.646074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysctl-d\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.649270 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.649207 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:23:18.649270 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.649259 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 16:23:18.649420 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.649207 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650078 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650111 2571 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650150 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gsv58\"" Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650205 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650282 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650391 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:23:18.650504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650492 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:23:18.650879 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650592 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:23:18.650879 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.650704 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9hv2k\"" Apr 16 16:23:18.651272 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.651252 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t7wn4\"" Apr 16 16:23:18.651272 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.651268 2571 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:23:18.651393 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.651362 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:23:18.654577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.654254 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:23:18.654577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.654255 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:23:18.654749 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.654737 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:23:18.655256 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.655231 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-dnx9b\"" Apr 16 16:23:18.655256 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.655249 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hd8q2\"" Apr 16 16:23:18.655410 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.655321 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-nx66x\"" Apr 16 16:23:18.655410 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.655235 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gc99q\"" Apr 16 16:23:18.655513 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.655473 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:23:18.655560 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.655513 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:23:18.677023 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.676998 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:18:17 +0000 UTC" deadline="2027-12-26 11:07:47.412072436 +0000 UTC" Apr 16 16:23:18.677023 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.677021 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14850h44m28.735053485s" Apr 16 16:23:18.734886 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.734848 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:23:18.746634 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746597 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-kubelet\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.746813 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ea3158-fb50-45bd-a3ff-a9af8b130de9-host\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94" Apr 16 16:23:18.746813 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746714 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/93ea3158-fb50-45bd-a3ff-a9af8b130de9-serviceca\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94" Apr 16 16:23:18.746813 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-etc-selinux\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.746813 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-host\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.746813 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/93ea3158-fb50-45bd-a3ff-a9af8b130de9-host\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746822 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-netns\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746849 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cnibin\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-node-log\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746900 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdnp2\" (UniqueName: \"kubernetes.io/projected/5c076e87-1778-44fa-9253-5a9e0c898f3b-kube-api-access-fdnp2\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-etc-selinux\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cnibin\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746927 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-2x25k\" (UniqueName: \"kubernetes.io/projected/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-kube-api-access-2x25k\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-node-log\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.746977 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtp6s\" (UniqueName: \"kubernetes.io/projected/073e645f-92a9-4855-9057-6a125ec9ebda-kube-api-access-qtp6s\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:18.747053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747031 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-host\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6fa936c5-e151-4f22-8ab2-c2bc28919d4b-konnectivity-ca\") pod \"konnectivity-agent-p2krm\" (UID: \"6fa936c5-e151-4f22-8ab2-c2bc28919d4b\") " pod="kube-system/konnectivity-agent-p2krm" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747141 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6840e957-0163-4053-b7e6-599a98718065-multus-daemon-config\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747171 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-var-lib-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-modprobe-d\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747235 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-var-lib-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-socket-dir-parent\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.747503 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747279 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qtjd\" (UniqueName: \"kubernetes.io/projected/93ea3158-fb50-45bd-a3ff-a9af8b130de9-kube-api-access-5qtjd\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-env-overrides\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-run\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/93ea3158-fb50-45bd-a3ff-a9af8b130de9-serviceca\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747356 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-var-lib-kubelet\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.747503 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:23:18.747384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-modprobe-d\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2bnr\" (UniqueName: \"kubernetes.io/projected/037863f7-80dd-4ea9-9735-c27f4d903d1d-kube-api-access-d2bnr\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747415 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-run\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747438 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-var-lib-kubelet\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " 
pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.747503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747480 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-ovn\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-device-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747596 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-multus-certs\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-ovn\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747621 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/037863f7-80dd-4ea9-9735-c27f4d903d1d-tmp-dir\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747648 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-socket-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-device-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-sys-fs\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747701 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfvx5\" (UniqueName: \"kubernetes.io/projected/f4093fb1-792a-4c35-b82a-7713709a1a78-kube-api-access-tfvx5\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-sys-fs\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-conf-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747770 
2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-env-overrides\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-lib-modules\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747808 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-socket-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.748267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-system-cni-dir\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747883 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-etc-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.749065 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:23:18.747896 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-system-cni-dir\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-lib-modules\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-etc-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747970 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " 
pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.747985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-registration-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748015 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-kubernetes\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748048 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-openvswitch\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-kubernetes\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-kubelet\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-registration-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748128 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748177 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-kubelet\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.749065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748239 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f4093fb1-792a-4c35-b82a-7713709a1a78-host-slash\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748276 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-cni-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748320 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748347 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-k8s-cni-cncf-io\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748384 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-cni-multus\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748409 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-systemd\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-systemd\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-tuned\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748487 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-cnibin\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-os-release\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6840e957-0163-4053-b7e6-599a98718065-cni-binary-copy\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-cni-bin\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-slash\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748637 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748646 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e96b3ac-b7d4-44c4-92c5-7706938e5538-cni-binary-copy\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysctl-conf\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.749863 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/037863f7-80dd-4ea9-9735-c27f4d903d1d-hosts-file\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748740 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-cni-bin\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysconfig\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-etc-kubernetes\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrl6g\" (UniqueName: \"kubernetes.io/projected/6e96b3ac-b7d4-44c4-92c5-7706938e5538-kube-api-access-jrl6g\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748874 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-systemd-units\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748870 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-log-socket\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748950 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6fa936c5-e151-4f22-8ab2-c2bc28919d4b-agent-certs\") pod \"konnectivity-agent-p2krm\" (UID: \"6fa936c5-e151-4f22-8ab2-c2bc28919d4b\") " pod="kube-system/konnectivity-agent-p2krm"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748988 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-run-netns\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysctl-d\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-sys\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749063 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovnkube-config\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749088 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovnkube-script-lib\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749133 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4093fb1-792a-4c35-b82a-7713709a1a78-iptables-alerter-script\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bbnp\" (UniqueName: \"kubernetes.io/projected/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-kube-api-access-4bbnp\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.750659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749199 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-system-cni-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.748296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f4093fb1-792a-4c35-b82a-7713709a1a78-host-slash\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749228 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-hostroot\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749255 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jptn8\" (UniqueName: \"kubernetes.io/projected/6840e957-0163-4053-b7e6-599a98718065-kube-api-access-jptn8\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-os-release\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749310 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-cni-netd\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749367 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749374 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-systemd-units\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-systemd\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749396 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-tmp\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749561 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-log-socket\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-run-netns\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-run-systemd\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749822 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-sys\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749874 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-cni-bin\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749937 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysconfig\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749948 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysctl-conf\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.751450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.749979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-run-ovn-kubernetes\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750013 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f4093fb1-792a-4c35-b82a-7713709a1a78-iptables-alerter-script\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-cni-netd\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e96b3ac-b7d4-44c4-92c5-7706938e5538-os-release\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750150 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-sysctl-d\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovnkube-config\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.750232 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750258 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c076e87-1778-44fa-9253-5a9e0c898f3b-host-slash\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.750295 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:23:19.250276694 +0000 UTC m=+3.085129602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.750331 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovnkube-script-lib\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.752018 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-etc-tuned\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.752157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.752127 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-tmp\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.752648 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.752260 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c076e87-1778-44fa-9253-5a9e0c898f3b-ovn-node-metrics-cert\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.756039 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.755807 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" event={"ID":"3608a57f147686cc88eda8f9fac573e3","Type":"ContainerStarted","Data":"bf00ba667a5e56622d526aecaac6ef55b7b46fda6db4e291a0647896e7b427f5"}
Apr 16 16:23:18.756039 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.755965 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfvx5\" (UniqueName: \"kubernetes.io/projected/f4093fb1-792a-4c35-b82a-7713709a1a78-kube-api-access-tfvx5\") pod \"iptables-alerter-2722r\" (UID: \"f4093fb1-792a-4c35-b82a-7713709a1a78\") " pod="openshift-network-operator/iptables-alerter-2722r"
Apr 16 16:23:18.756239 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.756201 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qtjd\" (UniqueName: \"kubernetes.io/projected/93ea3158-fb50-45bd-a3ff-a9af8b130de9-kube-api-access-5qtjd\") pod \"node-ca-7th94\" (UID: \"93ea3158-fb50-45bd-a3ff-a9af8b130de9\") " pod="openshift-image-registry/node-ca-7th94"
Apr 16 16:23:18.756853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.756819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtp6s\" (UniqueName: \"kubernetes.io/projected/073e645f-92a9-4855-9057-6a125ec9ebda-kube-api-access-qtp6s\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:18.757621 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.757592 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal" event={"ID":"3daeaf9dae8edeea4bbaed1ffe567636","Type":"ContainerStarted","Data":"df3e065e8e847bbbc0c50c601926bf53c93e59eb25996f038d145f3ff9e8718b"}
Apr 16 16:23:18.757932 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.757898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdnp2\" (UniqueName: \"kubernetes.io/projected/5c076e87-1778-44fa-9253-5a9e0c898f3b-kube-api-access-fdnp2\") pod \"ovnkube-node-vm8pb\" (UID: \"5c076e87-1778-44fa-9253-5a9e0c898f3b\") " pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:23:18.758192 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.758157 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x25k\" (UniqueName: \"kubernetes.io/projected/7956710a-e3ab-4bc3-98eb-86be6f9ab5f1-kube-api-access-2x25k\") pod \"aws-ebs-csi-driver-node-hpk45\" (UID: \"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45"
Apr 16 16:23:18.758630 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.758611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bbnp\" (UniqueName: \"kubernetes.io/projected/c41ee48a-e153-4f75-94c9-5bb11b0dbfed-kube-api-access-4bbnp\") pod \"tuned-r58s4\" (UID: \"c41ee48a-e153-4f75-94c9-5bb11b0dbfed\") " pod="openshift-cluster-node-tuning-operator/tuned-r58s4"
Apr 16 16:23:18.759626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.759605 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrl6g\" (UniqueName: \"kubernetes.io/projected/6e96b3ac-b7d4-44c4-92c5-7706938e5538-kube-api-access-jrl6g\") pod \"multus-additional-cni-plugins-f9km4\" (UID: \"6e96b3ac-b7d4-44c4-92c5-7706938e5538\") " pod="openshift-multus/multus-additional-cni-plugins-f9km4"
Apr 16 16:23:18.850056 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.849971 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6fa936c5-e151-4f22-8ab2-c2bc28919d4b-konnectivity-ca\") pod \"konnectivity-agent-p2krm\" (UID: \"6fa936c5-e151-4f22-8ab2-c2bc28919d4b\") " pod="kube-system/konnectivity-agent-p2krm"
Apr 16 16:23:18.850056 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6840e957-0163-4053-b7e6-599a98718065-multus-daemon-config\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-socket-dir-parent\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850158 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2bnr\" (UniqueName: \"kubernetes.io/projected/037863f7-80dd-4ea9-9735-c27f4d903d1d-kube-api-access-d2bnr\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850187 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-multus-certs\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850199 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-socket-dir-parent\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/037863f7-80dd-4ea9-9735-c27f4d903d1d-tmp-dir\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-conf-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.850301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-cni-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850312 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-k8s-cni-cncf-io\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf"
Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]:
I0416 16:23:18.850335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-cni-multus\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-cnibin\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850386 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-os-release\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850411 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6840e957-0163-4053-b7e6-599a98718065-cni-binary-copy\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-cni-bin\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850465 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/037863f7-80dd-4ea9-9735-c27f4d903d1d-hosts-file\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850486 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-conf-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-etc-kubernetes\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850527 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6fa936c5-e151-4f22-8ab2-c2bc28919d4b-agent-certs\") pod \"konnectivity-agent-p2krm\" (UID: \"6fa936c5-e151-4f22-8ab2-c2bc28919d4b\") " pod="kube-system/konnectivity-agent-p2krm" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850535 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-etc-kubernetes\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850551 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-system-cni-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-hostroot\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-os-release\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jptn8\" (UniqueName: \"kubernetes.io/projected/6840e957-0163-4053-b7e6-599a98718065-kube-api-access-jptn8\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.850626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850625 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6fa936c5-e151-4f22-8ab2-c2bc28919d4b-konnectivity-ca\") pod \"konnectivity-agent-p2krm\" (UID: \"6fa936c5-e151-4f22-8ab2-c2bc28919d4b\") " pod="kube-system/konnectivity-agent-p2krm" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850646 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-kubelet\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850669 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/037863f7-80dd-4ea9-9735-c27f4d903d1d-tmp-dir\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850684 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-netns\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850686 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-kubelet\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-k8s-cni-cncf-io\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850688 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-multus-cni-dir\") 
pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850727 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-hostroot\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850734 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6840e957-0163-4053-b7e6-599a98718065-multus-daemon-config\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-multus-certs\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-cni-multus\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850737 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-system-cni-dir\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " 
pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-var-lib-cni-bin\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850784 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/037863f7-80dd-4ea9-9735-c27f4d903d1d-hosts-file\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850791 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-cnibin\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.850882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6840e957-0163-4053-b7e6-599a98718065-host-run-netns\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.851267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.851242 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6840e957-0163-4053-b7e6-599a98718065-cni-binary-copy\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.853594 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.853556 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6fa936c5-e151-4f22-8ab2-c2bc28919d4b-agent-certs\") pod \"konnectivity-agent-p2krm\" (UID: \"6fa936c5-e151-4f22-8ab2-c2bc28919d4b\") " pod="kube-system/konnectivity-agent-p2krm" Apr 16 16:23:18.856571 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.856548 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:18.856571 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.856574 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:18.856757 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.856588 2571 projected.go:194] Error preparing data for projected volume kube-api-access-wtkcg for pod openshift-network-diagnostics/network-check-target-9q8hb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:18.856757 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:18.856659 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg podName:8a6d5165-1857-4ea4-a3ef-5a344dd7e47b nodeName:}" failed. No retries permitted until 2026-04-16 16:23:19.356639042 +0000 UTC m=+3.191491937 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wtkcg" (UniqueName: "kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg") pod "network-check-target-9q8hb" (UID: "8a6d5165-1857-4ea4-a3ef-5a344dd7e47b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:18.858464 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.858435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2bnr\" (UniqueName: \"kubernetes.io/projected/037863f7-80dd-4ea9-9735-c27f4d903d1d-kube-api-access-d2bnr\") pod \"node-resolver-9q4lm\" (UID: \"037863f7-80dd-4ea9-9735-c27f4d903d1d\") " pod="openshift-dns/node-resolver-9q4lm" Apr 16 16:23:18.859314 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.859292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jptn8\" (UniqueName: \"kubernetes.io/projected/6840e957-0163-4053-b7e6-599a98718065-kube-api-access-jptn8\") pod \"multus-7gghf\" (UID: \"6840e957-0163-4053-b7e6-599a98718065\") " pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.939808 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.939765 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2722r" Apr 16 16:23:18.946644 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.946615 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:18.955535 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.955509 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" Apr 16 16:23:18.966252 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.966227 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r58s4" Apr 16 16:23:18.972842 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.972816 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7th94" Apr 16 16:23:18.979559 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.979540 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f9km4" Apr 16 16:23:18.985153 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.985131 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p2krm" Apr 16 16:23:18.991739 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.991712 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7gghf" Apr 16 16:23:18.997339 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:18.997312 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9q4lm" Apr 16 16:23:19.040589 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.040558 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:23:19.253932 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.253850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:19.254067 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:19.254022 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:19.254109 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:19.254100 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:23:20.254078913 +0000 UTC m=+4.088931799 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:19.400434 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:19.400403 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e96b3ac_b7d4_44c4_92c5_7706938e5538.slice/crio-7494f090d8a6bc24054629ff2d1abb9ec929e314d6af25d31ee0218900a43cb9 WatchSource:0}: Error finding container 7494f090d8a6bc24054629ff2d1abb9ec929e314d6af25d31ee0218900a43cb9: Status 404 returned error can't find the container with id 7494f090d8a6bc24054629ff2d1abb9ec929e314d6af25d31ee0218900a43cb9 Apr 16 16:23:19.402078 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:19.402047 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6840e957_0163_4053_b7e6_599a98718065.slice/crio-e836484f32d4446e587e478f95cec2045ed5d3caf10e897f1d7a04f578bfbdda WatchSource:0}: Error finding container e836484f32d4446e587e478f95cec2045ed5d3caf10e897f1d7a04f578bfbdda: Status 404 returned error can't find the container with id e836484f32d4446e587e478f95cec2045ed5d3caf10e897f1d7a04f578bfbdda Apr 16 16:23:19.405617 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:19.405588 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4093fb1_792a_4c35_b82a_7713709a1a78.slice/crio-334fe59c98bc1410209eea58613933250f247b6058ae66c250d04067da2d1abd WatchSource:0}: Error finding container 334fe59c98bc1410209eea58613933250f247b6058ae66c250d04067da2d1abd: Status 404 returned error can't find the container with id 334fe59c98bc1410209eea58613933250f247b6058ae66c250d04067da2d1abd Apr 16 16:23:19.406435 
ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:19.406415 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc41ee48a_e153_4f75_94c9_5bb11b0dbfed.slice/crio-028e54842411b315c679ef14e1d70bcc5af9d25d3bc4f9fd7256dc2174406562 WatchSource:0}: Error finding container 028e54842411b315c679ef14e1d70bcc5af9d25d3bc4f9fd7256dc2174406562: Status 404 returned error can't find the container with id 028e54842411b315c679ef14e1d70bcc5af9d25d3bc4f9fd7256dc2174406562 Apr 16 16:23:19.407539 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:19.407370 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c076e87_1778_44fa_9253_5a9e0c898f3b.slice/crio-51b8274d0744137b2dddf96ac179da24759b5ef9d34fffa9b1cc6eafb002b0b5 WatchSource:0}: Error finding container 51b8274d0744137b2dddf96ac179da24759b5ef9d34fffa9b1cc6eafb002b0b5: Status 404 returned error can't find the container with id 51b8274d0744137b2dddf96ac179da24759b5ef9d34fffa9b1cc6eafb002b0b5 Apr 16 16:23:19.408508 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:19.408244 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ea3158_fb50_45bd_a3ff_a9af8b130de9.slice/crio-b8539690e6e8280ebd576752275285e46a0d31c4282ef8172d4ec094e8dd7caf WatchSource:0}: Error finding container b8539690e6e8280ebd576752275285e46a0d31c4282ef8172d4ec094e8dd7caf: Status 404 returned error can't find the container with id b8539690e6e8280ebd576752275285e46a0d31c4282ef8172d4ec094e8dd7caf Apr 16 16:23:19.409179 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:23:19.409043 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod037863f7_80dd_4ea9_9735_c27f4d903d1d.slice/crio-8df371dc2d4849763731d471f9865e3dbd372222eb109bd0a6aef7d0c913763c WatchSource:0}: 
Error finding container 8df371dc2d4849763731d471f9865e3dbd372222eb109bd0a6aef7d0c913763c: Status 404 returned error can't find the container with id 8df371dc2d4849763731d471f9865e3dbd372222eb109bd0a6aef7d0c913763c Apr 16 16:23:19.455939 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.455736 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:19.456049 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:19.455895 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:19.456049 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:19.455985 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:19.456049 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:19.455998 2571 projected.go:194] Error preparing data for projected volume kube-api-access-wtkcg for pod openshift-network-diagnostics/network-check-target-9q8hb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:19.456187 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:19.456053 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg podName:8a6d5165-1857-4ea4-a3ef-5a344dd7e47b nodeName:}" failed. No retries permitted until 2026-04-16 16:23:20.456037852 +0000 UTC m=+4.290890736 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wtkcg" (UniqueName: "kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg") pod "network-check-target-9q8hb" (UID: "8a6d5165-1857-4ea4-a3ef-5a344dd7e47b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:19.678193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.678076 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:18:17 +0000 UTC" deadline="2027-12-11 07:45:25.338165653 +0000 UTC" Apr 16 16:23:19.678193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.678133 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14487h22m5.660053835s" Apr 16 16:23:19.759682 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.759641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" event={"ID":"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1","Type":"ContainerStarted","Data":"79f7387a20ecefda4b43aa2f812db6ed05677511e2b62cd88698682a1c770123"} Apr 16 16:23:19.760914 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.760881 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9q4lm" event={"ID":"037863f7-80dd-4ea9-9735-c27f4d903d1d","Type":"ContainerStarted","Data":"8df371dc2d4849763731d471f9865e3dbd372222eb109bd0a6aef7d0c913763c"} Apr 16 16:23:19.761960 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.761936 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7th94" event={"ID":"93ea3158-fb50-45bd-a3ff-a9af8b130de9","Type":"ContainerStarted","Data":"b8539690e6e8280ebd576752275285e46a0d31c4282ef8172d4ec094e8dd7caf"} Apr 16 16:23:19.763074 ip-10-0-138-125 kubenswrapper[2571]: 
I0416 16:23:19.763052 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"51b8274d0744137b2dddf96ac179da24759b5ef9d34fffa9b1cc6eafb002b0b5"} Apr 16 16:23:19.764097 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.764076 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2722r" event={"ID":"f4093fb1-792a-4c35-b82a-7713709a1a78","Type":"ContainerStarted","Data":"334fe59c98bc1410209eea58613933250f247b6058ae66c250d04067da2d1abd"} Apr 16 16:23:19.765381 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.765348 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gghf" event={"ID":"6840e957-0163-4053-b7e6-599a98718065","Type":"ContainerStarted","Data":"e836484f32d4446e587e478f95cec2045ed5d3caf10e897f1d7a04f578bfbdda"} Apr 16 16:23:19.766452 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.766423 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r58s4" event={"ID":"c41ee48a-e153-4f75-94c9-5bb11b0dbfed","Type":"ContainerStarted","Data":"028e54842411b315c679ef14e1d70bcc5af9d25d3bc4f9fd7256dc2174406562"} Apr 16 16:23:19.767595 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.767570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerStarted","Data":"7494f090d8a6bc24054629ff2d1abb9ec929e314d6af25d31ee0218900a43cb9"} Apr 16 16:23:19.769381 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.769238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal" event={"ID":"3daeaf9dae8edeea4bbaed1ffe567636","Type":"ContainerStarted","Data":"ca9c1d5883e2252718b3ba864fb03297820d872538a5e2cd6c7601719618f18d"} Apr 
16 16:23:19.770424 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.770402 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p2krm" event={"ID":"6fa936c5-e151-4f22-8ab2-c2bc28919d4b","Type":"ContainerStarted","Data":"0079fae3d9c6daefedfa3558a500f33e7213148cba1c6fe9d4d78f727408ae79"} Apr 16 16:23:19.787401 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:19.787340 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-125.ec2.internal" podStartSLOduration=2.78732245 podStartE2EDuration="2.78732245s" podCreationTimestamp="2026-04-16 16:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:23:19.786850875 +0000 UTC m=+3.621703801" watchObservedRunningTime="2026-04-16 16:23:19.78732245 +0000 UTC m=+3.622175355" Apr 16 16:23:20.261601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:20.261562 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:20.261787 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.261705 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:20.261787 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.261768 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:23:22.261749482 +0000 UTC m=+6.096602377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:20.463869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:20.463189 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:20.463869 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.463357 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:20.463869 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.463376 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:20.463869 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.463389 2571 projected.go:194] Error preparing data for projected volume kube-api-access-wtkcg for pod openshift-network-diagnostics/network-check-target-9q8hb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:20.463869 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.463446 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg podName:8a6d5165-1857-4ea4-a3ef-5a344dd7e47b nodeName:}" failed. 
No retries permitted until 2026-04-16 16:23:22.463428317 +0000 UTC m=+6.298281206 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtkcg" (UniqueName: "kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg") pod "network-check-target-9q8hb" (UID: "8a6d5165-1857-4ea4-a3ef-5a344dd7e47b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:20.757708 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:20.757634 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:20.758167 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.757758 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:20.758392 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:20.758367 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:20.758502 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:20.758482 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:20.801935 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:20.801874 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" event={"ID":"3608a57f147686cc88eda8f9fac573e3","Type":"ContainerDied","Data":"b8edf4fca562372d6a1f7bc0dc2f8d1574d9b7b76afedb3083ee128843c61bda"}
Apr 16 16:23:20.802163 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:20.801535 2571 generic.go:358] "Generic (PLEG): container finished" podID="3608a57f147686cc88eda8f9fac573e3" containerID="b8edf4fca562372d6a1f7bc0dc2f8d1574d9b7b76afedb3083ee128843c61bda" exitCode=0
Apr 16 16:23:21.814322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:21.814271 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" event={"ID":"3608a57f147686cc88eda8f9fac573e3","Type":"ContainerStarted","Data":"74a2c8ff33b1bc08a6c60e4c392bf2759221a455b447fcaa3baf82b19bf9bc37"}
Apr 16 16:23:22.286708 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:22.286668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:22.286898 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.286841 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:22.286961 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.286906 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs
podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:23:26.286886355 +0000 UTC m=+10.121739251 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:22.489044 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:22.488451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:22.489044 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.488643 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:23:22.489044 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.488662 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:23:22.489044 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.488671 2571 projected.go:194] Error preparing data for projected volume kube-api-access-wtkcg for pod openshift-network-diagnostics/network-check-target-9q8hb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:22.489044 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.488716 2571 nestedpendingoperations.go:348] Operation for
"{volumeName:kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg podName:8a6d5165-1857-4ea4-a3ef-5a344dd7e47b nodeName:}" failed. No retries permitted until 2026-04-16 16:23:26.488703005 +0000 UTC m=+10.323555887 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtkcg" (UniqueName: "kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg") pod "network-check-target-9q8hb" (UID: "8a6d5165-1857-4ea4-a3ef-5a344dd7e47b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:22.750663 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:22.750510 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:22.750663 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:22.750534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:22.750663 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.750651 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:22.750924 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:22.750741 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:24.754845 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:24.754185 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:24.754845 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:24.754302 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:24.754845 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:24.754693 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:24.754845 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:24.754804 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:26.323405 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:26.323361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:26.323881 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.323552 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:26.323881 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.323617 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:23:34.323597515 +0000 UTC m=+18.158450401 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:26.525038 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:26.524996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:26.525228 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.525196 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:23:26.525228 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.525225 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:23:26.525320 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.525240 2571 projected.go:194] Error preparing data for projected volume kube-api-access-wtkcg for pod openshift-network-diagnostics/network-check-target-9q8hb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:26.525320 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.525304 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg podName:8a6d5165-1857-4ea4-a3ef-5a344dd7e47b nodeName:}" failed.
No retries permitted until 2026-04-16 16:23:34.525282016 +0000 UTC m=+18.360134921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtkcg" (UniqueName: "kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg") pod "network-check-target-9q8hb" (UID: "8a6d5165-1857-4ea4-a3ef-5a344dd7e47b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:26.754253 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:26.754173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:26.754253 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:26.754197 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:26.754437 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.754296 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:26.754437 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:26.754379 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:28.751786 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:28.751545 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:28.751786 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:28.751595 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:28.751786 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:28.751757 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:28.752282 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:28.751880 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:30.751432 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:30.751399 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:30.751432 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:30.751420 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:30.751950 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:30.751509 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:30.751950 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:30.751633 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:32.750725 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:32.750685 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:32.751171 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:32.750731 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:32.751171 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:32.750837 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:32.751171 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:32.750983 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:34.377607 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:34.377556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:34.378015 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.377743 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:34.378015 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.377813 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:23:50.377791774 +0000 UTC m=+34.212644656 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:23:34.579920 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:34.579876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:34.580108 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.580087 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:23:34.580177 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.580112 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:23:34.580177 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.580136 2571 projected.go:194] Error preparing data for projected volume kube-api-access-wtkcg for pod openshift-network-diagnostics/network-check-target-9q8hb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:34.580241 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.580191 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg podName:8a6d5165-1857-4ea4-a3ef-5a344dd7e47b nodeName:}" failed.
No retries permitted until 2026-04-16 16:23:50.580176972 +0000 UTC m=+34.415029859 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtkcg" (UniqueName: "kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg") pod "network-check-target-9q8hb" (UID: "8a6d5165-1857-4ea4-a3ef-5a344dd7e47b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:34.750962 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:34.750882 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:34.751091 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:34.750882 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:34.751091 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.750995 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:34.751091 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:34.751058 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:36.754154 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.753750 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:36.754154 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.753817 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:36.763883 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:36.760963 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda"
Apr 16 16:23:36.763883 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:36.761107 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b"
Apr 16 16:23:36.840420 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.840381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p2krm" event={"ID":"6fa936c5-e151-4f22-8ab2-c2bc28919d4b","Type":"ContainerStarted","Data":"0e7626a8c0d9c53352e3faa8f0ab352fc5a826208088de625dced9bcfd66a877"}
Apr 16 16:23:36.842518 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.842278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" event={"ID":"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1","Type":"ContainerStarted","Data":"5a5a83b159c20b76d3aba6d3400d0b11db012ba1985c23f6b062a2c9d41c2dcf"}
Apr 16 16:23:36.845416 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.845393 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"821c0a2c5a24cfbe7a68a348cd6b330865d2d48792ff35d66487409918b8abe2"}
Apr 16 16:23:36.846750 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.846730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7gghf" event={"ID":"6840e957-0163-4053-b7e6-599a98718065","Type":"ContainerStarted","Data":"824bde6d9b8d284a4c2d895c9b25e55b3f13a137b0b2ca9beac9723ede38e4e2"}
Apr 16 16:23:36.848086 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.848066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r58s4" event={"ID":"c41ee48a-e153-4f75-94c9-5bb11b0dbfed","Type":"ContainerStarted","Data":"5eb5a717acd49a067d0745e8fc9082b979d5d281dfa663ae4bf18a9b1d85d97a"}
Apr 16 16:23:36.849174 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.849153 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerStarted","Data":"c7fedc13badfbc1bd2127be3e10f7ecc86b672b348d5b5e627065cf765c64d96"}
Apr 16 16:23:36.857410 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.857364 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-125.ec2.internal" podStartSLOduration=19.857346499 podStartE2EDuration="19.857346499s" podCreationTimestamp="2026-04-16 16:23:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:23:21.830472139 +0000 UTC m=+5.665325046" watchObservedRunningTime="2026-04-16 16:23:36.857346499 +0000 UTC m=+20.692199405"
Apr 16 16:23:36.857753 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.857723 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-p2krm" podStartSLOduration=11.722262413 podStartE2EDuration="20.857716035s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.41321223 +0000 UTC m=+3.248065111" lastFinishedPulling="2026-04-16 16:23:28.548665835 +0000 UTC m=+12.383518733" observedRunningTime="2026-04-16 16:23:36.856465924 +0000 UTC m=+20.691318829" watchObservedRunningTime="2026-04-16 16:23:36.857716035 +0000 UTC m=+20.692568938"
Apr 16 16:23:36.898791 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.898750 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7gghf" podStartSLOduration=3.7776879819999998 podStartE2EDuration="20.898735997s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.40436568 +0000 UTC m=+3.239218562" lastFinishedPulling="2026-04-16 16:23:36.525413689 +0000 UTC m=+20.360266577" observedRunningTime="2026-04-16
16:23:36.898402082 +0000 UTC m=+20.733254985" watchObservedRunningTime="2026-04-16 16:23:36.898735997 +0000 UTC m=+20.733588901"
Apr 16 16:23:36.916798 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:36.916753 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r58s4" podStartSLOduration=3.800226066 podStartE2EDuration="20.916735592s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.408264238 +0000 UTC m=+3.243117120" lastFinishedPulling="2026-04-16 16:23:36.524773742 +0000 UTC m=+20.359626646" observedRunningTime="2026-04-16 16:23:36.916470028 +0000 UTC m=+20.751322957" watchObservedRunningTime="2026-04-16 16:23:36.916735592 +0000 UTC m=+20.751588496"
Apr 16 16:23:37.421915 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.421680 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p2krm"
Apr 16 16:23:37.422376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.422340 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p2krm"
Apr 16 16:23:37.852761 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.852725 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e96b3ac-b7d4-44c4-92c5-7706938e5538" containerID="c7fedc13badfbc1bd2127be3e10f7ecc86b672b348d5b5e627065cf765c64d96" exitCode=0
Apr 16 16:23:37.853576 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.852814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerDied","Data":"c7fedc13badfbc1bd2127be3e10f7ecc86b672b348d5b5e627065cf765c64d96"}
Apr 16 16:23:37.854102 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.854042 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9q4lm"
event={"ID":"037863f7-80dd-4ea9-9735-c27f4d903d1d","Type":"ContainerStarted","Data":"cebd0f53255f2ae84bbc2e7ce19a6f155d6f59a03d0f6a4c0e3fe99f91c11abd"}
Apr 16 16:23:37.858141 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.858033 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7th94" event={"ID":"93ea3158-fb50-45bd-a3ff-a9af8b130de9","Type":"ContainerStarted","Data":"154c0a5b6cca17cd1b309952675a9b97c555b5f29ced958f3a9736a12d52d7b2"}
Apr 16 16:23:37.860267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.860248 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log"
Apr 16 16:23:37.860549 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.860530 2571 generic.go:358] "Generic (PLEG): container finished" podID="5c076e87-1778-44fa-9253-5a9e0c898f3b" containerID="fa4b9d57a560adf85b5852dc99f36ab6b07531bb9c2ffc5c70ed3b4b7d4f004b" exitCode=1
Apr 16 16:23:37.860686 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.860664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"3f7b24f96d7f45b74fec35ce828a925dfcaa4c259f9688ce94281ec2aea41671"}
Apr 16 16:23:37.860760 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.860699 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"1d7c983f8993cc57fdd7a78d68d1257c6ebb72fb83fb5dd0b9ec43c87cf0d92a"}
Apr 16 16:23:37.860760 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.860712 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"ffd51259b16cdc296d50753f81817be26e9832a6c9985aaf519f51d25a672d26"} Apr 16 16:23:37.860760 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.860725 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"9d422b7a5bafa822eeb2a2703b0d90226598bcbcf4b502c38ade091704196b14"} Apr 16 16:23:37.860760 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.860737 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerDied","Data":"fa4b9d57a560adf85b5852dc99f36ab6b07531bb9c2ffc5c70ed3b4b7d4f004b"} Apr 16 16:23:37.890864 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.890816 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9q4lm" podStartSLOduration=3.968367904 podStartE2EDuration="20.890802482s" podCreationTimestamp="2026-04-16 16:23:17 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.411756234 +0000 UTC m=+3.246609128" lastFinishedPulling="2026-04-16 16:23:36.334190805 +0000 UTC m=+20.169043706" observedRunningTime="2026-04-16 16:23:37.890743547 +0000 UTC m=+21.725596459" watchObservedRunningTime="2026-04-16 16:23:37.890802482 +0000 UTC m=+21.725655386" Apr 16 16:23:37.905678 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:37.905634 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7th94" podStartSLOduration=4.956627032 podStartE2EDuration="21.905620428s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.411604735 +0000 UTC m=+3.246457629" lastFinishedPulling="2026-04-16 16:23:36.360598144 +0000 UTC m=+20.195451025" observedRunningTime="2026-04-16 16:23:37.905179664 +0000 UTC 
m=+21.740032771" watchObservedRunningTime="2026-04-16 16:23:37.905620428 +0000 UTC m=+21.740473332" Apr 16 16:23:38.236047 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.236014 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:23:38.718984 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.718869 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:23:38.236038247Z","UUID":"09908476-8ec4-485e-870a-ac008a4fa722","Handler":null,"Name":"","Endpoint":""} Apr 16 16:23:38.721271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.721246 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:23:38.721441 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.721278 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:23:38.751244 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.751215 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:38.751417 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.751250 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:38.751417 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:38.751353 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda" Apr 16 16:23:38.751778 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:38.751750 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b" Apr 16 16:23:38.865362 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.865324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" event={"ID":"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1","Type":"ContainerStarted","Data":"9e1f45ce7628c653aa1d232dc15dbed9534d07ff83947727875dd7bbc67c6dbe"} Apr 16 16:23:38.867209 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.867186 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:23:38.867352 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.867261 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2722r" event={"ID":"f4093fb1-792a-4c35-b82a-7713709a1a78","Type":"ContainerStarted","Data":"574f770edf855a3bb8f08e0d965d66661f49750fa2778f440751a9a93280a5b9"} Apr 16 16:23:38.883755 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:38.883694 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2722r" podStartSLOduration=6.081087227 podStartE2EDuration="22.88367864s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.407132741 +0000 UTC m=+3.241985628" lastFinishedPulling="2026-04-16 16:23:36.209724142 +0000 UTC m=+20.044577041" 
observedRunningTime="2026-04-16 16:23:38.88286735 +0000 UTC m=+22.717720256" watchObservedRunningTime="2026-04-16 16:23:38.88367864 +0000 UTC m=+22.718531544" Apr 16 16:23:39.874767 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:39.874673 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" event={"ID":"7956710a-e3ab-4bc3-98eb-86be6f9ab5f1","Type":"ContainerStarted","Data":"04b11130be416ce9bf44bc5bee81c97f57e3c330a782bb95119e8cc2cfba8707"} Apr 16 16:23:39.877751 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:39.877731 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:23:39.878162 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:39.878129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"e4351edbc09869bbe139c0cce4b2d843ebaf551ebb9ce67b12eb2a93be2e78a1"} Apr 16 16:23:39.894032 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:39.893977 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hpk45" podStartSLOduration=3.908834144 podStartE2EDuration="23.893958674s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.411930949 +0000 UTC m=+3.246783841" lastFinishedPulling="2026-04-16 16:23:39.397055473 +0000 UTC m=+23.231908371" observedRunningTime="2026-04-16 16:23:39.893293014 +0000 UTC m=+23.728145919" watchObservedRunningTime="2026-04-16 16:23:39.893958674 +0000 UTC m=+23.728811580" Apr 16 16:23:40.751658 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:40.751434 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:40.751862 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:40.751775 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda" Apr 16 16:23:40.751968 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:40.751464 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:40.752094 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:40.752073 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b" Apr 16 16:23:41.885492 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:41.885464 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:23:42.751069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.750888 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:42.751265 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.750949 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:42.751265 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:42.751172 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b" Apr 16 16:23:42.751265 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:42.751206 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda" Apr 16 16:23:42.890069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.890042 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:23:42.891773 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.890761 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"85797a924096e3a22fc76d8f5ed9c270f4477ba4603641f37203fdeaee267eed"} Apr 16 16:23:42.891773 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.891540 2571 scope.go:117] "RemoveContainer" containerID="fa4b9d57a560adf85b5852dc99f36ab6b07531bb9c2ffc5c70ed3b4b7d4f004b" Apr 16 16:23:42.891979 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.891929 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:42.891979 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.891952 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:42.891979 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.891965 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:42.894496 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.894468 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e96b3ac-b7d4-44c4-92c5-7706938e5538" containerID="62d18ee675a32b0f39272375426b478ebdea2ca4d11811fd0977bd8ac17f8213" exitCode=0 Apr 16 16:23:42.894636 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.894509 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerDied","Data":"62d18ee675a32b0f39272375426b478ebdea2ca4d11811fd0977bd8ac17f8213"} Apr 16 16:23:42.908368 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.908342 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:42.908448 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:42.908439 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" Apr 16 16:23:43.809366 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.809335 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mtw25"] Apr 16 16:23:43.809505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.809492 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:43.809700 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:43.809620 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda" Apr 16 16:23:43.812134 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.812094 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9q8hb"] Apr 16 16:23:43.812224 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.812206 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:43.812301 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:43.812281 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b" Apr 16 16:23:43.899762 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.899733 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:23:43.900237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.900096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" event={"ID":"5c076e87-1778-44fa-9253-5a9e0c898f3b","Type":"ContainerStarted","Data":"b490cb4e6f507a89392ca526d1bd6b493ed06ff7b837710d95adee2cbddcdfc3"} Apr 16 16:23:43.901823 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.901789 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e96b3ac-b7d4-44c4-92c5-7706938e5538" containerID="cf76fbf46586ae573b061ddc63fe54a773866f54e75db51d192f35f8e8d864ba" exitCode=0 Apr 16 16:23:43.901931 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.901823 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerDied","Data":"cf76fbf46586ae573b061ddc63fe54a773866f54e75db51d192f35f8e8d864ba"} Apr 16 16:23:43.928685 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:43.928637 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb" podStartSLOduration=10.75343348 podStartE2EDuration="27.928623026s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.410641333 +0000 UTC m=+3.245494215" lastFinishedPulling="2026-04-16 16:23:36.585830865 +0000 UTC m=+20.420683761" observedRunningTime="2026-04-16 16:23:43.92830313 +0000 UTC m=+27.763156059" watchObservedRunningTime="2026-04-16 16:23:43.928623026 +0000 UTC m=+27.763475908" Apr 16 16:23:44.906411 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:23:44.906178 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e96b3ac-b7d4-44c4-92c5-7706938e5538" containerID="5ba65fbc295873eafeb0328627118e2f5afaaaa42e42d870a349989c28e78d9a" exitCode=0 Apr 16 16:23:44.906411 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:44.906265 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerDied","Data":"5ba65fbc295873eafeb0328627118e2f5afaaaa42e42d870a349989c28e78d9a"} Apr 16 16:23:45.751253 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:45.751217 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:45.751430 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:45.751217 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:45.751430 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:45.751359 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b" Apr 16 16:23:45.751522 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:45.751451 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda" Apr 16 16:23:46.433908 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:46.433875 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p2krm" Apr 16 16:23:46.434344 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:46.434152 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 16:23:46.434537 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:46.434513 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p2krm" Apr 16 16:23:47.750865 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:47.750826 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:47.751451 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:47.750828 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:47.751451 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:47.750960 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b" Apr 16 16:23:47.751451 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:47.751025 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda" Apr 16 16:23:49.751353 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:49.751310 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:49.752019 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:49.751317 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:49.752019 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:49.751444 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9q8hb" podUID="8a6d5165-1857-4ea4-a3ef-5a344dd7e47b" Apr 16 16:23:49.752019 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:49.751535 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mtw25" podUID="073e645f-92a9-4855-9057-6a125ec9ebda" Apr 16 16:23:50.391808 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.391769 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:23:50.391971 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.391911 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:50.392011 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.391979 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:24:22.391964541 +0000 UTC m=+66.226817423 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:23:50.469505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.469471 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-125.ec2.internal" event="NodeReady" Apr 16 16:23:50.469684 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.469622 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:23:50.521958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.521919 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-825fq"] Apr 16 16:23:50.541232 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.541147 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-497mc"] Apr 16 16:23:50.541385 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.541336 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-825fq" Apr 16 16:23:50.543854 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.543835 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:23:50.544378 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.544359 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2bd8s\"" Apr 16 16:23:50.544452 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.544432 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:23:50.553365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.553343 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-497mc"] Apr 16 16:23:50.553365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.553367 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-825fq"] Apr 16 16:23:50.553487 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.553453 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-497mc" Apr 16 16:23:50.555973 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.555952 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:23:50.556077 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.556017 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:23:50.556077 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.556041 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:23:50.556077 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.556056 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrwds\"" Apr 16 16:23:50.593811 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.593778 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8d51d9d-8870-4740-83f4-ac61a0da4fee-config-volume\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq" Apr 16 16:23:50.593972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.593818 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8d51d9d-8870-4740-83f4-ac61a0da4fee-tmp-dir\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq" Apr 16 16:23:50.593972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.593850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: 
\"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:23:50.593972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.593868 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq" Apr 16 16:23:50.593972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.593937 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfnn\" (UniqueName: \"kubernetes.io/projected/a8d51d9d-8870-4740-83f4-ac61a0da4fee-kube-api-access-wwfnn\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq" Apr 16 16:23:50.594144 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.593987 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:23:50.594144 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.594013 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:23:50.594144 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.594026 2571 projected.go:194] Error preparing data for projected volume kube-api-access-wtkcg for pod openshift-network-diagnostics/network-check-target-9q8hb: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:23:50.594144 
ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.594094 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg podName:8a6d5165-1857-4ea4-a3ef-5a344dd7e47b nodeName:}" failed. No retries permitted until 2026-04-16 16:24:22.594075971 +0000 UTC m=+66.428928853 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wtkcg" (UniqueName: "kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg") pod "network-check-target-9q8hb" (UID: "8a6d5165-1857-4ea4-a3ef-5a344dd7e47b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:23:50.694593 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.694552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2zgj\" (UniqueName: \"kubernetes.io/projected/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-kube-api-access-z2zgj\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:50.694763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.694629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwfnn\" (UniqueName: \"kubernetes.io/projected/a8d51d9d-8870-4740-83f4-ac61a0da4fee-kube-api-access-wwfnn\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:50.694763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.694648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:50.694763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.694669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8d51d9d-8870-4740-83f4-ac61a0da4fee-config-volume\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:50.694763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.694686 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8d51d9d-8870-4740-83f4-ac61a0da4fee-tmp-dir\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:50.694913 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.694784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:50.694913 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.694863 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:50.694982 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.694921 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls podName:a8d51d9d-8870-4740-83f4-ac61a0da4fee nodeName:}" failed. No retries permitted until 2026-04-16 16:23:51.194906923 +0000 UTC m=+35.029759805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls") pod "dns-default-825fq" (UID: "a8d51d9d-8870-4740-83f4-ac61a0da4fee") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:50.695182 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.695161 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8d51d9d-8870-4740-83f4-ac61a0da4fee-tmp-dir\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:50.695234 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.695182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8d51d9d-8870-4740-83f4-ac61a0da4fee-config-volume\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:50.704890 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.704863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwfnn\" (UniqueName: \"kubernetes.io/projected/a8d51d9d-8870-4740-83f4-ac61a0da4fee-kube-api-access-wwfnn\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:50.795910 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.795853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2zgj\" (UniqueName: \"kubernetes.io/projected/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-kube-api-access-z2zgj\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:50.796474 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.795911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:50.796474 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.796016 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:50.796474 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:50.796069 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert podName:6fe47973-f1d6-4d8b-bcfc-e729c9709d5f nodeName:}" failed. No retries permitted until 2026-04-16 16:23:51.296055565 +0000 UTC m=+35.130908448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert") pod "ingress-canary-497mc" (UID: "6fe47973-f1d6-4d8b-bcfc-e729c9709d5f") : secret "canary-serving-cert" not found
Apr 16 16:23:50.807708 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:50.807678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2zgj\" (UniqueName: \"kubernetes.io/projected/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-kube-api-access-z2zgj\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:51.198764 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.198711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:51.198937 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:51.198866 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:51.198937 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:51.198934 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls podName:a8d51d9d-8870-4740-83f4-ac61a0da4fee nodeName:}" failed. No retries permitted until 2026-04-16 16:23:52.198917852 +0000 UTC m=+36.033770738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls") pod "dns-default-825fq" (UID: "a8d51d9d-8870-4740-83f4-ac61a0da4fee") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:51.299541 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.299505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:51.299728 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:51.299667 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:51.299791 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:51.299746 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert podName:6fe47973-f1d6-4d8b-bcfc-e729c9709d5f nodeName:}" failed. No retries permitted until 2026-04-16 16:23:52.299725964 +0000 UTC m=+36.134578869 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert") pod "ingress-canary-497mc" (UID: "6fe47973-f1d6-4d8b-bcfc-e729c9709d5f") : secret "canary-serving-cert" not found
Apr 16 16:23:51.750781 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.750745 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:23:51.750961 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.750745 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:23:51.753532 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.753511 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:23:51.754355 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.754337 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:23:51.754442 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.754359 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pm5r6\""
Apr 16 16:23:51.754442 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.754340 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:23:51.754543 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.754475 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wtfd7\""
Apr 16 16:23:51.923851 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.923819 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e96b3ac-b7d4-44c4-92c5-7706938e5538" containerID="1aa802f902adc19c22479eee4bc15362a7568894820a222cbed6ce72b34f887b" exitCode=0
Apr 16 16:23:51.923851 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:51.923855 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerDied","Data":"1aa802f902adc19c22479eee4bc15362a7568894820a222cbed6ce72b34f887b"}
Apr 16 16:23:52.206499 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:52.206468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:52.206655 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:52.206578 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:52.206655 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:52.206630 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls podName:a8d51d9d-8870-4740-83f4-ac61a0da4fee nodeName:}" failed. No retries permitted until 2026-04-16 16:23:54.206616015 +0000 UTC m=+38.041468897 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls") pod "dns-default-825fq" (UID: "a8d51d9d-8870-4740-83f4-ac61a0da4fee") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:52.307212 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:52.307110 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:52.307324 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:52.307271 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:52.307363 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:52.307342 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert podName:6fe47973-f1d6-4d8b-bcfc-e729c9709d5f nodeName:}" failed. No retries permitted until 2026-04-16 16:23:54.307326662 +0000 UTC m=+38.142179544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert") pod "ingress-canary-497mc" (UID: "6fe47973-f1d6-4d8b-bcfc-e729c9709d5f") : secret "canary-serving-cert" not found
Apr 16 16:23:52.928383 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:52.928349 2571 generic.go:358] "Generic (PLEG): container finished" podID="6e96b3ac-b7d4-44c4-92c5-7706938e5538" containerID="40d5d0ac95a2aba6411d17eb5cb76a23a505729d9e1f4054b7c6de91b5f6f0c7" exitCode=0
Apr 16 16:23:52.928722 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:52.928387 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerDied","Data":"40d5d0ac95a2aba6411d17eb5cb76a23a505729d9e1f4054b7c6de91b5f6f0c7"}
Apr 16 16:23:53.933468 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:53.933290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f9km4" event={"ID":"6e96b3ac-b7d4-44c4-92c5-7706938e5538","Type":"ContainerStarted","Data":"bb3bc0cd6fbaf7bb42e7c931454b0a2938a1033deb9537823e348b680b350cb4"}
Apr 16 16:23:53.955610 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:53.955558 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f9km4" podStartSLOduration=6.599105642 podStartE2EDuration="37.955544732s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:23:19.403485771 +0000 UTC m=+3.238338654" lastFinishedPulling="2026-04-16 16:23:50.759924846 +0000 UTC m=+34.594777744" observedRunningTime="2026-04-16 16:23:53.954430232 +0000 UTC m=+37.789283131" watchObservedRunningTime="2026-04-16 16:23:53.955544732 +0000 UTC m=+37.790397636"
Apr 16 16:23:54.223307 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:54.223208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:54.223457 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:54.223357 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:54.223457 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:54.223423 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls podName:a8d51d9d-8870-4740-83f4-ac61a0da4fee nodeName:}" failed. No retries permitted until 2026-04-16 16:23:58.223406399 +0000 UTC m=+42.058259284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls") pod "dns-default-825fq" (UID: "a8d51d9d-8870-4740-83f4-ac61a0da4fee") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:54.324067 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:54.324030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:54.324237 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:54.324202 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:54.324304 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:54.324292 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert podName:6fe47973-f1d6-4d8b-bcfc-e729c9709d5f nodeName:}" failed. No retries permitted until 2026-04-16 16:23:58.324266551 +0000 UTC m=+42.159119433 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert") pod "ingress-canary-497mc" (UID: "6fe47973-f1d6-4d8b-bcfc-e729c9709d5f") : secret "canary-serving-cert" not found
Apr 16 16:23:58.254517 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:58.254474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:23:58.254911 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:58.254624 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:23:58.254911 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:58.254688 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls podName:a8d51d9d-8870-4740-83f4-ac61a0da4fee nodeName:}" failed. No retries permitted until 2026-04-16 16:24:06.25467019 +0000 UTC m=+50.089523072 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls") pod "dns-default-825fq" (UID: "a8d51d9d-8870-4740-83f4-ac61a0da4fee") : secret "dns-default-metrics-tls" not found
Apr 16 16:23:58.355258 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:23:58.355203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:23:58.355436 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:58.355351 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:23:58.355436 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:23:58.355431 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert podName:6fe47973-f1d6-4d8b-bcfc-e729c9709d5f nodeName:}" failed. No retries permitted until 2026-04-16 16:24:06.355414786 +0000 UTC m=+50.190267667 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert") pod "ingress-canary-497mc" (UID: "6fe47973-f1d6-4d8b-bcfc-e729c9709d5f") : secret "canary-serving-cert" not found
Apr 16 16:24:06.310535 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:06.310495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:24:06.311146 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:06.310638 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:24:06.311146 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:06.310727 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls podName:a8d51d9d-8870-4740-83f4-ac61a0da4fee nodeName:}" failed. No retries permitted until 2026-04-16 16:24:22.31070586 +0000 UTC m=+66.145558747 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls") pod "dns-default-825fq" (UID: "a8d51d9d-8870-4740-83f4-ac61a0da4fee") : secret "dns-default-metrics-tls" not found
Apr 16 16:24:06.411144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:06.411090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:24:06.411280 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:06.411240 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:24:06.411329 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:06.411318 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert podName:6fe47973-f1d6-4d8b-bcfc-e729c9709d5f nodeName:}" failed. No retries permitted until 2026-04-16 16:24:22.411302195 +0000 UTC m=+66.246155076 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert") pod "ingress-canary-497mc" (UID: "6fe47973-f1d6-4d8b-bcfc-e729c9709d5f") : secret "canary-serving-cert" not found
Apr 16 16:24:14.916650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:14.916622 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vm8pb"
Apr 16 16:24:22.319046 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.318986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq"
Apr 16 16:24:22.319488 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:22.319188 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:24:22.319488 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:22.319272 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls podName:a8d51d9d-8870-4740-83f4-ac61a0da4fee nodeName:}" failed. No retries permitted until 2026-04-16 16:24:54.319253055 +0000 UTC m=+98.154105938 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls") pod "dns-default-825fq" (UID: "a8d51d9d-8870-4740-83f4-ac61a0da4fee") : secret "dns-default-metrics-tls" not found
Apr 16 16:24:22.420305 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.420256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25"
Apr 16 16:24:22.420305 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.420314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc"
Apr 16 16:24:22.420550 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:22.420431 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:24:22.420550 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:22.420484 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert podName:6fe47973-f1d6-4d8b-bcfc-e729c9709d5f nodeName:}" failed. No retries permitted until 2026-04-16 16:24:54.42047047 +0000 UTC m=+98.255323351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert") pod "ingress-canary-497mc" (UID: "6fe47973-f1d6-4d8b-bcfc-e729c9709d5f") : secret "canary-serving-cert" not found
Apr 16 16:24:22.423072 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.423048 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:24:22.431384 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:22.431359 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:24:22.431519 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:22.431425 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs podName:073e645f-92a9-4855-9057-6a125ec9ebda nodeName:}" failed. No retries permitted until 2026-04-16 16:25:26.431411253 +0000 UTC m=+130.266264134 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs") pod "network-metrics-daemon-mtw25" (UID: "073e645f-92a9-4855-9057-6a125ec9ebda") : secret "metrics-daemon-secret" not found
Apr 16 16:24:22.621942 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.621833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:24:22.624679 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.624662 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:24:22.635072 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.635050 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:24:22.646235 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.646214 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtkcg\" (UniqueName: \"kubernetes.io/projected/8a6d5165-1857-4ea4-a3ef-5a344dd7e47b-kube-api-access-wtkcg\") pod \"network-check-target-9q8hb\" (UID: \"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b\") " pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:24:22.666989 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.666963 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wtfd7\""
Apr 16 16:24:22.675068 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.675045 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:24:22.799285 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.799256 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9q8hb"]
Apr 16 16:24:22.802904 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:22.802874 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a6d5165_1857_4ea4_a3ef_5a344dd7e47b.slice/crio-3b9da8e889fcf95098f7a30af207ee3bd6004959c13ceca30d0418789f13174b WatchSource:0}: Error finding container 3b9da8e889fcf95098f7a30af207ee3bd6004959c13ceca30d0418789f13174b: Status 404 returned error can't find the container with id 3b9da8e889fcf95098f7a30af207ee3bd6004959c13ceca30d0418789f13174b
Apr 16 16:24:22.990184 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:22.990146 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9q8hb" event={"ID":"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b","Type":"ContainerStarted","Data":"3b9da8e889fcf95098f7a30af207ee3bd6004959c13ceca30d0418789f13174b"}
Apr 16 16:24:25.997609 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:25.997569 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9q8hb" event={"ID":"8a6d5165-1857-4ea4-a3ef-5a344dd7e47b","Type":"ContainerStarted","Data":"c94c75e5e14f016f9d12b111660f15010664083895a9da51bbb1b0855a972637"}
Apr 16 16:24:25.998048 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:25.997681 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9q8hb"
Apr 16 16:24:26.017451 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:26.017396 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9q8hb" podStartSLOduration=67.025871619 podStartE2EDuration="1m10.017381034s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:24:22.804676135 +0000 UTC m=+66.639529017" lastFinishedPulling="2026-04-16 16:24:25.796185545 +0000 UTC m=+69.631038432" observedRunningTime="2026-04-16 16:24:26.01699153 +0000 UTC m=+69.851844434" watchObservedRunningTime="2026-04-16 16:24:26.017381034 +0000 UTC m=+69.852233939"
Apr 16 16:24:36.764740 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.764632 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-n874j"]
Apr 16 16:24:36.767373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.767350 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j"
Apr 16 16:24:36.767728 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.767706 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s"]
Apr 16 16:24:36.770061 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.770044 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 16:24:36.770325 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.770311 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s"
Apr 16 16:24:36.771407 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.771390 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 16:24:36.771611 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.771597 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 16:24:36.771959 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.771942 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:24:36.772049 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.772036 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-r9w4v\""
Apr 16 16:24:36.773140 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.773104 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:24:36.773259 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.773241 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 16:24:36.775537 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.775517 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-8shql\""
Apr 16 16:24:36.775763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.775744 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 16:24:36.776293 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.776264 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-n874j"]
Apr 16 16:24:36.778645 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.778624 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s"]
Apr 16 16:24:36.878095 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.878063 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf"]
Apr 16 16:24:36.880813 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.880795 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf"
Apr 16 16:24:36.885546 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.885524 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 16:24:36.885736 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.885672 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6db657d5cd-98k8d"]
Apr 16 16:24:36.886404 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.886380 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 16:24:36.887328 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.887308 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gct5f\""
Apr 16 16:24:36.888318 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.888298 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6db657d5cd-98k8d"
Apr 16 16:24:36.891178 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.891156 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 16:24:36.891275 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.891203 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 16:24:36.891275 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.891229 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 16:24:36.892104 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.891882 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 16:24:36.892104 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.891944 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 16:24:36.892104 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.891886 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-zfnlp\""
Apr 16 16:24:36.892370 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.892346 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 16:24:36.905025 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.905001 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf"]
Apr 16 16:24:36.912603 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.912582 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6db657d5cd-98k8d"]
Apr 16 16:24:36.927947 ip-10-0-138-125
kubenswrapper[2571]: I0416 16:24:36.927914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e38bc16-772b-49da-b705-b184cb60f9bd-config\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:36.928086 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.927975 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzx8f\" (UniqueName: \"kubernetes.io/projected/03a575af-ab57-48c7-9610-4e4bca8f14d2-kube-api-access-qzx8f\") pod \"volume-data-source-validator-7d955d5dd4-sqx2s\" (UID: \"03a575af-ab57-48c7-9610-4e4bca8f14d2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s" Apr 16 16:24:36.928086 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.928059 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e38bc16-772b-49da-b705-b184cb60f9bd-trusted-ca\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:36.928193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.928095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e38bc16-772b-49da-b705-b184cb60f9bd-serving-cert\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:36.928193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.928132 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8m86\" 
(UniqueName: \"kubernetes.io/projected/2e38bc16-772b-49da-b705-b184cb60f9bd-kube-api-access-b8m86\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:36.970572 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.970538 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw"] Apr 16 16:24:36.973326 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.973310 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:36.975009 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.974985 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-hxp97"] Apr 16 16:24:36.977523 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.977503 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:24:36.977646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.977530 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:24:36.977646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.977538 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 16:24:36.977646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.977534 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-crcmn\"" Apr 16 16:24:36.977646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.977608 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:36.978861 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.978842 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 16:24:36.980088 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.980072 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:24:36.981364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.981344 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 16:24:36.981994 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.981976 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 16:24:36.982410 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.982387 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:24:36.982410 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.982406 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-m9hql\"" Apr 16 16:24:36.983953 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.983932 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw"] Apr 16 16:24:36.986875 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.986855 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 16:24:36.991531 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:36.991507 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-hxp97"] Apr 16 
16:24:37.029259 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svqrc\" (UniqueName: \"kubernetes.io/projected/23b07769-d51e-41eb-bd71-9c987916dbd8-kube-api-access-svqrc\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.029259 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzx8f\" (UniqueName: \"kubernetes.io/projected/03a575af-ab57-48c7-9610-4e4bca8f14d2-kube-api-access-qzx8f\") pod \"volume-data-source-validator-7d955d5dd4-sqx2s\" (UID: \"03a575af-ab57-48c7-9610-4e4bca8f14d2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s" Apr 16 16:24:37.029425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029272 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-default-certificate\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.029425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029386 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e38bc16-772b-49da-b705-b184cb60f9bd-trusted-ca\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.029425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2e38bc16-772b-49da-b705-b184cb60f9bd-serving-cert\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.029538 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8m86\" (UniqueName: \"kubernetes.io/projected/2e38bc16-772b-49da-b705-b184cb60f9bd-kube-api-access-b8m86\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.029538 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-stats-auth\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.029538 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.029674 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b5859820-c3a7-4853-87f4-1a9946dbeaa1-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:37.029674 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029640 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e38bc16-772b-49da-b705-b184cb60f9bd-config\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.029674 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029657 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:37.029813 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.029682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.030413 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.030390 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e38bc16-772b-49da-b705-b184cb60f9bd-trusted-ca\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.030774 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.030755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e38bc16-772b-49da-b705-b184cb60f9bd-config\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.031696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.031675 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e38bc16-772b-49da-b705-b184cb60f9bd-serving-cert\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.041923 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.041901 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8m86\" (UniqueName: \"kubernetes.io/projected/2e38bc16-772b-49da-b705-b184cb60f9bd-kube-api-access-b8m86\") pod \"console-operator-d87b8d5fc-n874j\" (UID: \"2e38bc16-772b-49da-b705-b184cb60f9bd\") " pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.042087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.042068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzx8f\" (UniqueName: \"kubernetes.io/projected/03a575af-ab57-48c7-9610-4e4bca8f14d2-kube-api-access-qzx8f\") pod \"volume-data-source-validator-7d955d5dd4-sqx2s\" (UID: \"03a575af-ab57-48c7-9610-4e4bca8f14d2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s" Apr 16 16:24:37.078454 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.078414 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:37.084238 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.084214 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s" Apr 16 16:24:37.130529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.130375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-default-certificate\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.130529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.130423 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.130529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.130453 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-serving-cert\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.130529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.130487 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cq2\" (UniqueName: \"kubernetes.io/projected/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-kube-api-access-85cq2\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.130529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.130521 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.130560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-stats-auth\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.130588 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-snapshots\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131544 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131580 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-service-ca-bundle\") pod 
\"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131638 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b5859820-c3a7-4853-87f4-1a9946dbeaa1-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131665 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131703 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131728 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.136575 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131750 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-tmp\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svqrc\" (UniqueName: \"kubernetes.io/projected/23b07769-d51e-41eb-bd71-9c987916dbd8-kube-api-access-svqrc\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.131822 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fkl2\" (UniqueName: \"kubernetes.io/projected/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-kube-api-access-7fkl2\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.131974 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.132032 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:37.632013258 +0000 UTC m=+81.466866156 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : secret "router-metrics-certs-default" not found Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.132887 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b5859820-c3a7-4853-87f4-1a9946dbeaa1-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:37.136575 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.132994 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:37.137336 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.133039 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert podName:b5859820-c3a7-4853-87f4-1a9946dbeaa1 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:37.633023838 +0000 UTC m=+81.467876734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-jzjhf" (UID: "b5859820-c3a7-4853-87f4-1a9946dbeaa1") : secret "networking-console-plugin-cert" not found Apr 16 16:24:37.137336 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.133111 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:24:37.633099756 +0000 UTC m=+81.467952653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : configmap references non-existent config key: service-ca.crt Apr 16 16:24:37.137336 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.134024 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-stats-auth\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.137336 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.136476 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-default-certificate\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.155336 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.154914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svqrc\" (UniqueName: \"kubernetes.io/projected/23b07769-d51e-41eb-bd71-9c987916dbd8-kube-api-access-svqrc\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.227688 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.227657 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-n874j"] Apr 16 16:24:37.231153 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:37.231103 2571 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e38bc16_772b_49da_b705_b184cb60f9bd.slice/crio-8a929f2303b726205b6b07ce88a8888d80e87a720a55dd5234faeeac64c4e4b5 WatchSource:0}: Error finding container 8a929f2303b726205b6b07ce88a8888d80e87a720a55dd5234faeeac64c4e4b5: Status 404 returned error can't find the container with id 8a929f2303b726205b6b07ce88a8888d80e87a720a55dd5234faeeac64c4e4b5 Apr 16 16:24:37.232193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232168 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.232296 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.232296 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-tmp\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.232376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fkl2\" (UniqueName: 
\"kubernetes.io/projected/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-kube-api-access-7fkl2\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.232376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232358 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.232477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232382 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-serving-cert\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.232477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85cq2\" (UniqueName: \"kubernetes.io/projected/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-kube-api-access-85cq2\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.232477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232451 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.232612 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232487 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-snapshots\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.232893 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.232818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.232960 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.232901 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:37.233022 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.232977 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls podName:43a0dbe4-b395-4cbb-93b0-0918a13c59ec nodeName:}" failed. No retries permitted until 2026-04-16 16:24:37.732955248 +0000 UTC m=+81.567808150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jl4xw" (UID: "43a0dbe4-b395-4cbb-93b0-0918a13c59ec") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:37.233090 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.233025 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.233324 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.233304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-tmp\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.233550 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.233527 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.233662 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.233650 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-snapshots\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.235238 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.235218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-serving-cert\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.240714 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.240696 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fkl2\" (UniqueName: \"kubernetes.io/projected/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-kube-api-access-7fkl2\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.242577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.242557 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s"] Apr 16 16:24:37.244572 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.244552 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cq2\" (UniqueName: \"kubernetes.io/projected/aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a-kube-api-access-85cq2\") pod \"insights-operator-5785d4fcdd-hxp97\" (UID: \"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a\") " pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.246713 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:37.246691 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03a575af_ab57_48c7_9610_4e4bca8f14d2.slice/crio-11d8d766f62b9dcd76dcfdd6e8a4a853357d6873f729d8c775a28247374ae709 WatchSource:0}: Error finding container 
11d8d766f62b9dcd76dcfdd6e8a4a853357d6873f729d8c775a28247374ae709: Status 404 returned error can't find the container with id 11d8d766f62b9dcd76dcfdd6e8a4a853357d6873f729d8c775a28247374ae709 Apr 16 16:24:37.289745 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.289660 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" Apr 16 16:24:37.401322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.401291 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-hxp97"] Apr 16 16:24:37.404308 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:37.404283 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa6d7ce4_f88b_42f5_b5ec_36b1c9e90e9a.slice/crio-8c3a9bdce299ddad201ffab7e0c0ff44650401cff0801465e7f6663f116f0f9c WatchSource:0}: Error finding container 8c3a9bdce299ddad201ffab7e0c0ff44650401cff0801465e7f6663f116f0f9c: Status 404 returned error can't find the container with id 8c3a9bdce299ddad201ffab7e0c0ff44650401cff0801465e7f6663f116f0f9c Apr 16 16:24:37.637188 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.637082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.637188 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.637181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:37.637374 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.637205 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:37.637374 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.637236 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:24:37.637374 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.637311 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:37.637374 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.637316 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.637297021 +0000 UTC m=+82.472149914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : secret "router-metrics-certs-default" not found Apr 16 16:24:37.637374 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.637366 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:24:38.637351298 +0000 UTC m=+82.472204180 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : configmap references non-existent config key: service-ca.crt Apr 16 16:24:37.637374 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.637376 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert podName:b5859820-c3a7-4853-87f4-1a9946dbeaa1 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.637370456 +0000 UTC m=+82.472223338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-jzjhf" (UID: "b5859820-c3a7-4853-87f4-1a9946dbeaa1") : secret "networking-console-plugin-cert" not found Apr 16 16:24:37.738246 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:37.738206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:37.738419 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.738345 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:37.738419 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:37.738412 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls podName:43a0dbe4-b395-4cbb-93b0-0918a13c59ec nodeName:}" failed. No retries permitted until 2026-04-16 16:24:38.738393932 +0000 UTC m=+82.573246815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jl4xw" (UID: "43a0dbe4-b395-4cbb-93b0-0918a13c59ec") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:38.021026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:38.020992 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" event={"ID":"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a","Type":"ContainerStarted","Data":"8c3a9bdce299ddad201ffab7e0c0ff44650401cff0801465e7f6663f116f0f9c"} Apr 16 16:24:38.022399 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:38.022340 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" event={"ID":"2e38bc16-772b-49da-b705-b184cb60f9bd","Type":"ContainerStarted","Data":"8a929f2303b726205b6b07ce88a8888d80e87a720a55dd5234faeeac64c4e4b5"} Apr 16 16:24:38.023385 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:38.023359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s" event={"ID":"03a575af-ab57-48c7-9610-4e4bca8f14d2","Type":"ContainerStarted","Data":"11d8d766f62b9dcd76dcfdd6e8a4a853357d6873f729d8c775a28247374ae709"} Apr 16 16:24:38.646213 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:38.646175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod 
\"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:38.646394 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:38.646246 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:38.646394 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:38.646277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:38.646394 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:38.646365 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:24:38.646524 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:38.646428 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:38.646524 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:38.646447 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:40.646428269 +0000 UTC m=+84.481281172 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : configmap references non-existent config key: service-ca.crt Apr 16 16:24:38.646524 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:38.646471 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:40.64645852 +0000 UTC m=+84.481311429 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : secret "router-metrics-certs-default" not found Apr 16 16:24:38.646524 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:38.646485 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert podName:b5859820-c3a7-4853-87f4-1a9946dbeaa1 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:40.646477422 +0000 UTC m=+84.481330319 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-jzjhf" (UID: "b5859820-c3a7-4853-87f4-1a9946dbeaa1") : secret "networking-console-plugin-cert" not found Apr 16 16:24:38.747509 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:38.747456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:38.747691 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:38.747678 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:38.747759 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:38.747749 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls podName:43a0dbe4-b395-4cbb-93b0-0918a13c59ec nodeName:}" failed. No retries permitted until 2026-04-16 16:24:40.747729491 +0000 UTC m=+84.582582377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jl4xw" (UID: "43a0dbe4-b395-4cbb-93b0-0918a13c59ec") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:40.030093 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.030036 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s" event={"ID":"03a575af-ab57-48c7-9610-4e4bca8f14d2","Type":"ContainerStarted","Data":"6be4ca1140c0e5966bcf0d3f52891cee4a0e25182952282bd926e73706c1cea3"} Apr 16 16:24:40.031657 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.031503 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" event={"ID":"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a","Type":"ContainerStarted","Data":"252df2a085dff4c20ff9d9518ce81c3075e002aa0924cefe4d18a0f5ae94eccc"} Apr 16 16:24:40.032861 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.032838 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" event={"ID":"2e38bc16-772b-49da-b705-b184cb60f9bd","Type":"ContainerStarted","Data":"6fffab25169c753382ac5ea095f706156802c4c73391da9a625904135bba4bf1"} Apr 16 16:24:40.033055 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.033040 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:40.034166 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.034142 2571 patch_prober.go:28] interesting pod/console-operator-d87b8d5fc-n874j container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.134.0.7:8443/readyz\": dial tcp 10.134.0.7:8443: connect: connection refused" 
start-of-body= Apr 16 16:24:40.034266 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.034183 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" podUID="2e38bc16-772b-49da-b705-b184cb60f9bd" containerName="console-operator" probeResult="failure" output="Get \"https://10.134.0.7:8443/readyz\": dial tcp 10.134.0.7:8443: connect: connection refused" Apr 16 16:24:40.046577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.046524 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-sqx2s" podStartSLOduration=1.392893501 podStartE2EDuration="4.046508906s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:24:37.24862414 +0000 UTC m=+81.083477026" lastFinishedPulling="2026-04-16 16:24:39.90223955 +0000 UTC m=+83.737092431" observedRunningTime="2026-04-16 16:24:40.046096403 +0000 UTC m=+83.880949320" watchObservedRunningTime="2026-04-16 16:24:40.046508906 +0000 UTC m=+83.881361812" Apr 16 16:24:40.066153 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.066087 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" podStartSLOduration=1.39244872 podStartE2EDuration="4.066007443s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:24:37.233238929 +0000 UTC m=+81.068091813" lastFinishedPulling="2026-04-16 16:24:39.906797654 +0000 UTC m=+83.741650536" observedRunningTime="2026-04-16 16:24:40.065433101 +0000 UTC m=+83.900286006" watchObservedRunningTime="2026-04-16 16:24:40.066007443 +0000 UTC m=+83.900860350" Apr 16 16:24:40.083904 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.083852 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" 
podStartSLOduration=1.584876901 podStartE2EDuration="4.083835381s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:24:37.406206612 +0000 UTC m=+81.241059494" lastFinishedPulling="2026-04-16 16:24:39.905165091 +0000 UTC m=+83.740017974" observedRunningTime="2026-04-16 16:24:40.081708004 +0000 UTC m=+83.916560911" watchObservedRunningTime="2026-04-16 16:24:40.083835381 +0000 UTC m=+83.918688285" Apr 16 16:24:40.665483 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.665435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:40.665483 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.665480 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:40.665678 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:40.665589 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:44.665572774 +0000 UTC m=+88.500425659 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : configmap references non-existent config key: service-ca.crt Apr 16 16:24:40.665719 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:40.665668 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:40.665753 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:40.665737 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert podName:b5859820-c3a7-4853-87f4-1a9946dbeaa1 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:44.665718883 +0000 UTC m=+88.500571781 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-jzjhf" (UID: "b5859820-c3a7-4853-87f4-1a9946dbeaa1") : secret "networking-console-plugin-cert" not found Apr 16 16:24:40.665872 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.665839 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:40.665960 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:40.665944 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:24:40.666011 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:40.666002 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:44.665991633 +0000 UTC m=+88.500844515 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : secret "router-metrics-certs-default" not found Apr 16 16:24:40.766966 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:40.766926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:40.767164 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:40.767094 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:40.767219 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:40.767193 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls podName:43a0dbe4-b395-4cbb-93b0-0918a13c59ec nodeName:}" failed. No retries permitted until 2026-04-16 16:24:44.767169946 +0000 UTC m=+88.602022835 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jl4xw" (UID: "43a0dbe4-b395-4cbb-93b0-0918a13c59ec") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:41.035672 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.035591 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/0.log" Apr 16 16:24:41.035672 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.035630 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e38bc16-772b-49da-b705-b184cb60f9bd" containerID="6fffab25169c753382ac5ea095f706156802c4c73391da9a625904135bba4bf1" exitCode=255 Apr 16 16:24:41.035672 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.035665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" event={"ID":"2e38bc16-772b-49da-b705-b184cb60f9bd","Type":"ContainerDied","Data":"6fffab25169c753382ac5ea095f706156802c4c73391da9a625904135bba4bf1"} Apr 16 16:24:41.036265 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.035994 2571 scope.go:117] "RemoveContainer" containerID="6fffab25169c753382ac5ea095f706156802c4c73391da9a625904135bba4bf1" Apr 16 16:24:41.183157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.183110 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8"] Apr 16 16:24:41.186073 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.186050 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" Apr 16 16:24:41.214018 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.213988 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 16:24:41.214190 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.214098 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-hnrwv\"" Apr 16 16:24:41.216486 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.216468 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 16:24:41.241530 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.241499 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8"] Apr 16 16:24:41.372281 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.372190 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzdhs\" (UniqueName: \"kubernetes.io/projected/36c0655a-39bf-41ac-a093-9d9b0567949f-kube-api-access-nzdhs\") pod \"migrator-64d4d94569-98gh8\" (UID: \"36c0655a-39bf-41ac-a093-9d9b0567949f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" Apr 16 16:24:41.473203 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.473150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzdhs\" (UniqueName: \"kubernetes.io/projected/36c0655a-39bf-41ac-a093-9d9b0567949f-kube-api-access-nzdhs\") pod \"migrator-64d4d94569-98gh8\" (UID: \"36c0655a-39bf-41ac-a093-9d9b0567949f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" Apr 16 16:24:41.484431 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:24:41.484401 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzdhs\" (UniqueName: \"kubernetes.io/projected/36c0655a-39bf-41ac-a093-9d9b0567949f-kube-api-access-nzdhs\") pod \"migrator-64d4d94569-98gh8\" (UID: \"36c0655a-39bf-41ac-a093-9d9b0567949f\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" Apr 16 16:24:41.507699 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.507656 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" Apr 16 16:24:41.629411 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:41.629306 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8"] Apr 16 16:24:41.632821 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:41.632782 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c0655a_39bf_41ac_a093_9d9b0567949f.slice/crio-7e69b2466546586c9ee72dbcdec8dbdc2dae8d83e1831b2ad726235ca0194581 WatchSource:0}: Error finding container 7e69b2466546586c9ee72dbcdec8dbdc2dae8d83e1831b2ad726235ca0194581: Status 404 returned error can't find the container with id 7e69b2466546586c9ee72dbcdec8dbdc2dae8d83e1831b2ad726235ca0194581 Apr 16 16:24:42.040591 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:42.040565 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:24:42.041060 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:42.041005 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/0.log" Apr 16 16:24:42.041060 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:24:42.041037 2571 generic.go:358] "Generic (PLEG): container finished" podID="2e38bc16-772b-49da-b705-b184cb60f9bd" containerID="2b6ab8d773533cc7f283adb215bcfadc1a4c4f2512d07366e2cc813daf24f36c" exitCode=255 Apr 16 16:24:42.041167 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:42.041144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" event={"ID":"2e38bc16-772b-49da-b705-b184cb60f9bd","Type":"ContainerDied","Data":"2b6ab8d773533cc7f283adb215bcfadc1a4c4f2512d07366e2cc813daf24f36c"} Apr 16 16:24:42.041221 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:42.041206 2571 scope.go:117] "RemoveContainer" containerID="6fffab25169c753382ac5ea095f706156802c4c73391da9a625904135bba4bf1" Apr 16 16:24:42.041447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:42.041424 2571 scope.go:117] "RemoveContainer" containerID="2b6ab8d773533cc7f283adb215bcfadc1a4c4f2512d07366e2cc813daf24f36c" Apr 16 16:24:42.041646 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:42.041620 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-n874j_openshift-console-operator(2e38bc16-772b-49da-b705-b184cb60f9bd)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" podUID="2e38bc16-772b-49da-b705-b184cb60f9bd" Apr 16 16:24:42.042396 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:42.042168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" event={"ID":"36c0655a-39bf-41ac-a093-9d9b0567949f","Type":"ContainerStarted","Data":"7e69b2466546586c9ee72dbcdec8dbdc2dae8d83e1831b2ad726235ca0194581"} Apr 16 16:24:43.046738 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:43.046660 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:24:43.047185 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:43.047048 2571 scope.go:117] "RemoveContainer" containerID="2b6ab8d773533cc7f283adb215bcfadc1a4c4f2512d07366e2cc813daf24f36c" Apr 16 16:24:43.047302 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:43.047273 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-n874j_openshift-console-operator(2e38bc16-772b-49da-b705-b184cb60f9bd)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" podUID="2e38bc16-772b-49da-b705-b184cb60f9bd" Apr 16 16:24:43.048278 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:43.048257 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" event={"ID":"36c0655a-39bf-41ac-a093-9d9b0567949f","Type":"ContainerStarted","Data":"887030a8b343f9aca16377be08043b70e5464b0e9a7759d23f612c2a7c4a3e89"} Apr 16 16:24:43.048344 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:43.048287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" event={"ID":"36c0655a-39bf-41ac-a093-9d9b0567949f","Type":"ContainerStarted","Data":"fc419095c67c0212f4e265eded0117e637a6a086e191fb6e539b3188976d80d7"} Apr 16 16:24:43.085485 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:43.085425 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-98gh8" podStartSLOduration=0.929601031 podStartE2EDuration="2.085408778s" podCreationTimestamp="2026-04-16 16:24:41 +0000 UTC" firstStartedPulling="2026-04-16 16:24:41.634702239 +0000 UTC m=+85.469555122" 
lastFinishedPulling="2026-04-16 16:24:42.790509985 +0000 UTC m=+86.625362869" observedRunningTime="2026-04-16 16:24:43.084146568 +0000 UTC m=+86.918999466" watchObservedRunningTime="2026-04-16 16:24:43.085408778 +0000 UTC m=+86.920261686" Apr 16 16:24:44.152468 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:44.152440 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9q4lm_037863f7-80dd-4ea9-9735-c27f4d903d1d/dns-node-resolver/0.log" Apr 16 16:24:44.698095 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:44.698059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:44.698292 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:44.698139 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:44.698292 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:44.698158 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:44.698292 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:44.698248 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret 
"router-metrics-certs-default" not found Apr 16 16:24:44.698292 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:44.698279 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:52.698266128 +0000 UTC m=+96.533119010 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : configmap references non-existent config key: service-ca.crt Apr 16 16:24:44.698427 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:44.698305 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs podName:23b07769-d51e-41eb-bd71-9c987916dbd8 nodeName:}" failed. No retries permitted until 2026-04-16 16:24:52.69829198 +0000 UTC m=+96.533144862 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs") pod "router-default-6db657d5cd-98k8d" (UID: "23b07769-d51e-41eb-bd71-9c987916dbd8") : secret "router-metrics-certs-default" not found Apr 16 16:24:44.698427 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:44.698248 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:44.698427 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:44.698347 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert podName:b5859820-c3a7-4853-87f4-1a9946dbeaa1 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:24:52.698336351 +0000 UTC m=+96.533189250 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-jzjhf" (UID: "b5859820-c3a7-4853-87f4-1a9946dbeaa1") : secret "networking-console-plugin-cert" not found Apr 16 16:24:44.798591 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:44.798545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:44.798762 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:44.798702 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:44.798797 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:44.798772 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls podName:43a0dbe4-b395-4cbb-93b0-0918a13c59ec nodeName:}" failed. No retries permitted until 2026-04-16 16:24:52.798754443 +0000 UTC m=+96.633607326 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-jl4xw" (UID: "43a0dbe4-b395-4cbb-93b0-0918a13c59ec") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:24:45.151625 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:45.151550 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7th94_93ea3158-fb50-45bd-a3ff-a9af8b130de9/node-ca/0.log" Apr 16 16:24:46.354642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:46.354614 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-98gh8_36c0655a-39bf-41ac-a093-9d9b0567949f/migrator/0.log" Apr 16 16:24:46.552391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:46.552366 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-98gh8_36c0655a-39bf-41ac-a093-9d9b0567949f/graceful-termination/0.log" Apr 16 16:24:47.079540 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:47.079498 2571 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:47.079907 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:47.079893 2571 scope.go:117] "RemoveContainer" containerID="2b6ab8d773533cc7f283adb215bcfadc1a4c4f2512d07366e2cc813daf24f36c" Apr 16 16:24:47.080064 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:47.080048 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-n874j_openshift-console-operator(2e38bc16-772b-49da-b705-b184cb60f9bd)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" 
podUID="2e38bc16-772b-49da-b705-b184cb60f9bd" Apr 16 16:24:50.033227 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:50.033193 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" Apr 16 16:24:50.033596 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:50.033574 2571 scope.go:117] "RemoveContainer" containerID="2b6ab8d773533cc7f283adb215bcfadc1a4c4f2512d07366e2cc813daf24f36c" Apr 16 16:24:50.033771 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:50.033752 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-n874j_openshift-console-operator(2e38bc16-772b-49da-b705-b184cb60f9bd)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" podUID="2e38bc16-772b-49da-b705-b184cb60f9bd" Apr 16 16:24:52.763945 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.763910 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" Apr 16 16:24:52.763945 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.763951 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:52.764411 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.764008 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:52.764411 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:52.764058 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 16:24:52.764411 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:24:52.764148 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert podName:b5859820-c3a7-4853-87f4-1a9946dbeaa1 nodeName:}" failed. No retries permitted until 2026-04-16 16:25:08.764130113 +0000 UTC m=+112.598982994 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-jzjhf" (UID: "b5859820-c3a7-4853-87f4-1a9946dbeaa1") : secret "networking-console-plugin-cert" not found Apr 16 16:24:52.764604 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.764584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23b07769-d51e-41eb-bd71-9c987916dbd8-service-ca-bundle\") pod \"router-default-6db657d5cd-98k8d\" (UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:52.766356 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.766338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/23b07769-d51e-41eb-bd71-9c987916dbd8-metrics-certs\") pod \"router-default-6db657d5cd-98k8d\" 
(UID: \"23b07769-d51e-41eb-bd71-9c987916dbd8\") " pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:52.799186 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.799143 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:52.864971 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.864936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:52.867243 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.867218 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/43a0dbe4-b395-4cbb-93b0-0918a13c59ec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-jl4xw\" (UID: \"43a0dbe4-b395-4cbb-93b0-0918a13c59ec\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:52.884466 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.884436 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" Apr 16 16:24:52.937091 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:52.937058 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6db657d5cd-98k8d"] Apr 16 16:24:52.940768 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:52.940727 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23b07769_d51e_41eb_bd71_9c987916dbd8.slice/crio-e63d37ec6b7e1561a3eeb038d48b99b8dc310709e83d5c179eb9f7d8f211b6b5 WatchSource:0}: Error finding container e63d37ec6b7e1561a3eeb038d48b99b8dc310709e83d5c179eb9f7d8f211b6b5: Status 404 returned error can't find the container with id e63d37ec6b7e1561a3eeb038d48b99b8dc310709e83d5c179eb9f7d8f211b6b5 Apr 16 16:24:53.018681 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:53.018652 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw"] Apr 16 16:24:53.021815 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:53.021777 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a0dbe4_b395_4cbb_93b0_0918a13c59ec.slice/crio-bc12021005d4b50fb68303794da17aaa914fbd2ecb50385f8bfe211e6622632b WatchSource:0}: Error finding container bc12021005d4b50fb68303794da17aaa914fbd2ecb50385f8bfe211e6622632b: Status 404 returned error can't find the container with id bc12021005d4b50fb68303794da17aaa914fbd2ecb50385f8bfe211e6622632b Apr 16 16:24:53.074144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:53.074086 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" event={"ID":"43a0dbe4-b395-4cbb-93b0-0918a13c59ec","Type":"ContainerStarted","Data":"bc12021005d4b50fb68303794da17aaa914fbd2ecb50385f8bfe211e6622632b"} Apr 16 
16:24:53.075414 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:53.075391 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6db657d5cd-98k8d" event={"ID":"23b07769-d51e-41eb-bd71-9c987916dbd8","Type":"ContainerStarted","Data":"23f3a060e818f94d9873fe683ad6915592419e3036cc457f9a9afe6ca9a5c4cd"} Apr 16 16:24:53.075414 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:53.075416 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6db657d5cd-98k8d" event={"ID":"23b07769-d51e-41eb-bd71-9c987916dbd8","Type":"ContainerStarted","Data":"e63d37ec6b7e1561a3eeb038d48b99b8dc310709e83d5c179eb9f7d8f211b6b5"} Apr 16 16:24:53.800320 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:53.800281 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:53.803186 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:53.803159 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:53.825746 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:53.825687 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6db657d5cd-98k8d" podStartSLOduration=17.825668894 podStartE2EDuration="17.825668894s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:24:53.111202204 +0000 UTC m=+96.946055108" watchObservedRunningTime="2026-04-16 16:24:53.825668894 +0000 UTC m=+97.660521800" Apr 16 16:24:54.078956 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.078872 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:54.080371 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:24:54.080320 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6db657d5cd-98k8d" Apr 16 16:24:54.377815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.377708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq" Apr 16 16:24:54.380614 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.380586 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8d51d9d-8870-4740-83f4-ac61a0da4fee-metrics-tls\") pod \"dns-default-825fq\" (UID: \"a8d51d9d-8870-4740-83f4-ac61a0da4fee\") " pod="openshift-dns/dns-default-825fq" Apr 16 16:24:54.453916 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.453877 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2bd8s\"" Apr 16 16:24:54.462198 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.462168 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-825fq" Apr 16 16:24:54.478409 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.478374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc" Apr 16 16:24:54.480906 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.480876 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6fe47973-f1d6-4d8b-bcfc-e729c9709d5f-cert\") pod \"ingress-canary-497mc\" (UID: \"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f\") " pod="openshift-ingress-canary/ingress-canary-497mc" Apr 16 16:24:54.633642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.633562 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-825fq"] Apr 16 16:24:54.637218 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:54.637173 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d51d9d_8870_4740_83f4_ac61a0da4fee.slice/crio-fc2bf554020767fb3b1c8b71e2802626540e906d436d885c6011f9af72a17567 WatchSource:0}: Error finding container fc2bf554020767fb3b1c8b71e2802626540e906d436d885c6011f9af72a17567: Status 404 returned error can't find the container with id fc2bf554020767fb3b1c8b71e2802626540e906d436d885c6011f9af72a17567 Apr 16 16:24:54.773803 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.773771 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hrwds\"" Apr 16 16:24:54.781220 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.781191 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-497mc" Apr 16 16:24:54.913829 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:54.913794 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-497mc"] Apr 16 16:24:54.917217 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:24:54.917175 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe47973_f1d6_4d8b_bcfc_e729c9709d5f.slice/crio-1382bb8d557fbbebf0b4c6452631f3bd057205d9cac6af81369017bbdce61c9a WatchSource:0}: Error finding container 1382bb8d557fbbebf0b4c6452631f3bd057205d9cac6af81369017bbdce61c9a: Status 404 returned error can't find the container with id 1382bb8d557fbbebf0b4c6452631f3bd057205d9cac6af81369017bbdce61c9a Apr 16 16:24:55.083455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:55.083406 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" event={"ID":"43a0dbe4-b395-4cbb-93b0-0918a13c59ec","Type":"ContainerStarted","Data":"b7933f58f79e49d09188ba7d6c703af1fa7c5b2eca7ba4a9daf1680eacc7b73a"} Apr 16 16:24:55.084600 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:55.084570 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-825fq" event={"ID":"a8d51d9d-8870-4740-83f4-ac61a0da4fee","Type":"ContainerStarted","Data":"fc2bf554020767fb3b1c8b71e2802626540e906d436d885c6011f9af72a17567"} Apr 16 16:24:55.085924 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:55.085883 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-497mc" event={"ID":"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f","Type":"ContainerStarted","Data":"1382bb8d557fbbebf0b4c6452631f3bd057205d9cac6af81369017bbdce61c9a"} Apr 16 16:24:55.110834 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:55.110776 2571 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-jl4xw" podStartSLOduration=17.588536378 podStartE2EDuration="19.110752569s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:24:53.023233199 +0000 UTC m=+96.858086084" lastFinishedPulling="2026-04-16 16:24:54.545449383 +0000 UTC m=+98.380302275" observedRunningTime="2026-04-16 16:24:55.10951898 +0000 UTC m=+98.944371884" watchObservedRunningTime="2026-04-16 16:24:55.110752569 +0000 UTC m=+98.945605473" Apr 16 16:24:56.092702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:56.092146 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-825fq" event={"ID":"a8d51d9d-8870-4740-83f4-ac61a0da4fee","Type":"ContainerStarted","Data":"70c3775ccfdfa71a9bd27ea281170de253c19c7145609b1da0aee0a11ac7782b"} Apr 16 16:24:57.001238 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:57.001206 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9q8hb" Apr 16 16:24:57.096949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:57.096911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-825fq" event={"ID":"a8d51d9d-8870-4740-83f4-ac61a0da4fee","Type":"ContainerStarted","Data":"d3c0eacbd766253c3549ed467d84faa145ee32a2bee1ec6058bc18c35eaf2780"} Apr 16 16:24:57.097439 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:57.097043 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-825fq" Apr 16 16:24:57.100343 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:57.100314 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-497mc" event={"ID":"6fe47973-f1d6-4d8b-bcfc-e729c9709d5f","Type":"ContainerStarted","Data":"d95fd9c329b4b2f3691c61f84c695ea1922ed5bf8db2157ed468dd34d8b35ffa"} Apr 16 16:24:57.118550 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:24:57.118500 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-825fq" podStartSLOduration=65.82821583 podStartE2EDuration="1m7.118485482s" podCreationTimestamp="2026-04-16 16:23:50 +0000 UTC" firstStartedPulling="2026-04-16 16:24:54.639309427 +0000 UTC m=+98.474162313" lastFinishedPulling="2026-04-16 16:24:55.929579079 +0000 UTC m=+99.764431965" observedRunningTime="2026-04-16 16:24:57.118110678 +0000 UTC m=+100.952963583" watchObservedRunningTime="2026-04-16 16:24:57.118485482 +0000 UTC m=+100.953338387" Apr 16 16:24:57.135717 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:24:57.135665 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-497mc" podStartSLOduration=65.435174795 podStartE2EDuration="1m7.135651257s" podCreationTimestamp="2026-04-16 16:23:50 +0000 UTC" firstStartedPulling="2026-04-16 16:24:54.919542513 +0000 UTC m=+98.754395395" lastFinishedPulling="2026-04-16 16:24:56.620018975 +0000 UTC m=+100.454871857" observedRunningTime="2026-04-16 16:24:57.134733051 +0000 UTC m=+100.969585966" watchObservedRunningTime="2026-04-16 16:24:57.135651257 +0000 UTC m=+100.970504161" Apr 16 16:25:02.988367 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:02.988328 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vcnhj"] Apr 16 16:25:02.993729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:02.993707 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vcnhj" Apr 16 16:25:02.997517 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:02.997492 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:25:02.997635 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:02.997521 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-t6bvr\"" Apr 16 16:25:02.997635 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:02.997492 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:25:03.020787 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.020760 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vcnhj"] Apr 16 16:25:03.043426 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.043391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnj8f\" (UniqueName: \"kubernetes.io/projected/2b669da5-f0be-47de-9e64-383b411f4607-kube-api-access-rnj8f\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj" Apr 16 16:25:03.043589 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.043518 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b669da5-f0be-47de-9e64-383b411f4607-crio-socket\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj" Apr 16 16:25:03.043589 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.043545 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b669da5-f0be-47de-9e64-383b411f4607-data-volume\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.043704 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.043639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b669da5-f0be-47de-9e64-383b411f4607-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.043704 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.043674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b669da5-f0be-47de-9e64-383b411f4607-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.144599 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.144567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b669da5-f0be-47de-9e64-383b411f4607-crio-socket\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.144599 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.144605 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b669da5-f0be-47de-9e64-383b411f4607-data-volume\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.145092 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.144637 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b669da5-f0be-47de-9e64-383b411f4607-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.145092 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.144669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b669da5-f0be-47de-9e64-383b411f4607-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.145092 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.144708 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnj8f\" (UniqueName: \"kubernetes.io/projected/2b669da5-f0be-47de-9e64-383b411f4607-kube-api-access-rnj8f\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.145092 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.144718 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2b669da5-f0be-47de-9e64-383b411f4607-crio-socket\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.145092 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.145023 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2b669da5-f0be-47de-9e64-383b411f4607-data-volume\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.145892 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.145871 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2b669da5-f0be-47de-9e64-383b411f4607-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.147082 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.147055 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2b669da5-f0be-47de-9e64-383b411f4607-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.163386 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.163361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnj8f\" (UniqueName: \"kubernetes.io/projected/2b669da5-f0be-47de-9e64-383b411f4607-kube-api-access-rnj8f\") pod \"insights-runtime-extractor-vcnhj\" (UID: \"2b669da5-f0be-47de-9e64-383b411f4607\") " pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.302895 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.302793 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vcnhj"
Apr 16 16:25:03.428762 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.428729 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vcnhj"]
Apr 16 16:25:03.431614 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:03.431582 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b669da5_f0be_47de_9e64_383b411f4607.slice/crio-7fba9e0eb71efffcd111e367d0170c82aebcbcb793a65c51f3d7f1f8351c1f03 WatchSource:0}: Error finding container 7fba9e0eb71efffcd111e367d0170c82aebcbcb793a65c51f3d7f1f8351c1f03: Status 404 returned error can't find the container with id 7fba9e0eb71efffcd111e367d0170c82aebcbcb793a65c51f3d7f1f8351c1f03
Apr 16 16:25:03.750965 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:03.750931 2571 scope.go:117] "RemoveContainer" containerID="2b6ab8d773533cc7f283adb215bcfadc1a4c4f2512d07366e2cc813daf24f36c"
Apr 16 16:25:04.119802 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.119708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vcnhj" event={"ID":"2b669da5-f0be-47de-9e64-383b411f4607","Type":"ContainerStarted","Data":"b9c81852957e58d3ce17cbbe66e7ca915f16b35b6535965c46e4da4e76cd6ff8"}
Apr 16 16:25:04.119802 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.119760 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vcnhj" event={"ID":"2b669da5-f0be-47de-9e64-383b411f4607","Type":"ContainerStarted","Data":"7fba9e0eb71efffcd111e367d0170c82aebcbcb793a65c51f3d7f1f8351c1f03"}
Apr 16 16:25:04.121662 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.121634 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:25:04.121795 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.121706 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j" event={"ID":"2e38bc16-772b-49da-b705-b184cb60f9bd","Type":"ContainerStarted","Data":"5c05e3cc2e035decb359853f90f818915f41a733322489bb819dde1a4b8cce33"}
Apr 16 16:25:04.122096 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.122042 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j"
Apr 16 16:25:04.555785 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.555754 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-n874j"
Apr 16 16:25:04.796713 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.796676 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-q95n6"]
Apr 16 16:25:04.799893 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.799874 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-q95n6"
Apr 16 16:25:04.802644 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.802620 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 16:25:04.802789 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.802724 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-9wq2b\""
Apr 16 16:25:04.806703 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.806679 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 16:25:04.816699 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.816671 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-q95n6"]
Apr 16 16:25:04.858357 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.858318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rfhq\" (UniqueName: \"kubernetes.io/projected/3823e5d2-5cbd-4e2a-bf95-dacceea78679-kube-api-access-6rfhq\") pod \"downloads-586b57c7b4-q95n6\" (UID: \"3823e5d2-5cbd-4e2a-bf95-dacceea78679\") " pod="openshift-console/downloads-586b57c7b4-q95n6"
Apr 16 16:25:04.958765 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.958733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rfhq\" (UniqueName: \"kubernetes.io/projected/3823e5d2-5cbd-4e2a-bf95-dacceea78679-kube-api-access-6rfhq\") pod \"downloads-586b57c7b4-q95n6\" (UID: \"3823e5d2-5cbd-4e2a-bf95-dacceea78679\") " pod="openshift-console/downloads-586b57c7b4-q95n6"
Apr 16 16:25:04.967880 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:04.967845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rfhq\" (UniqueName: \"kubernetes.io/projected/3823e5d2-5cbd-4e2a-bf95-dacceea78679-kube-api-access-6rfhq\") pod \"downloads-586b57c7b4-q95n6\" (UID: \"3823e5d2-5cbd-4e2a-bf95-dacceea78679\") " pod="openshift-console/downloads-586b57c7b4-q95n6"
Apr 16 16:25:05.109250 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:05.109156 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-q95n6"
Apr 16 16:25:05.126041 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:05.126011 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vcnhj" event={"ID":"2b669da5-f0be-47de-9e64-383b411f4607","Type":"ContainerStarted","Data":"7a66f6dac38dbaa802578f64b5f162138377255a96d64efbbb33f724130b6b60"}
Apr 16 16:25:05.236169 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:05.236132 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-q95n6"]
Apr 16 16:25:05.240512 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:05.240467 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3823e5d2_5cbd_4e2a_bf95_dacceea78679.slice/crio-4051407d0ef5b2fce7f75ea228894ebc4fb814e37ef68aae6085d3f890729c7f WatchSource:0}: Error finding container 4051407d0ef5b2fce7f75ea228894ebc4fb814e37ef68aae6085d3f890729c7f: Status 404 returned error can't find the container with id 4051407d0ef5b2fce7f75ea228894ebc4fb814e37ef68aae6085d3f890729c7f
Apr 16 16:25:06.130409 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:06.130363 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-q95n6" event={"ID":"3823e5d2-5cbd-4e2a-bf95-dacceea78679","Type":"ContainerStarted","Data":"4051407d0ef5b2fce7f75ea228894ebc4fb814e37ef68aae6085d3f890729c7f"}
Apr 16 16:25:07.105278 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:07.105244 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-825fq"
Apr 16 16:25:07.137106 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:07.137074 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vcnhj" event={"ID":"2b669da5-f0be-47de-9e64-383b411f4607","Type":"ContainerStarted","Data":"d3238156a09b7dd855734671a6ecd42cdfc32c13f5e128de5f78e9e4db6d0b3a"}
Apr 16 16:25:07.156754 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:07.156700 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vcnhj" podStartSLOduration=2.431272628 podStartE2EDuration="5.15668431s" podCreationTimestamp="2026-04-16 16:25:02 +0000 UTC" firstStartedPulling="2026-04-16 16:25:03.485588606 +0000 UTC m=+107.320441489" lastFinishedPulling="2026-04-16 16:25:06.211000271 +0000 UTC m=+110.045853171" observedRunningTime="2026-04-16 16:25:07.155336493 +0000 UTC m=+110.990189427" watchObservedRunningTime="2026-04-16 16:25:07.15668431 +0000 UTC m=+110.991537214"
Apr 16 16:25:08.795445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:08.795397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf"
Apr 16 16:25:08.798155 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:08.798097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b5859820-c3a7-4853-87f4-1a9946dbeaa1-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-jzjhf\" (UID: \"b5859820-c3a7-4853-87f4-1a9946dbeaa1\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf"
Apr 16 16:25:08.990248 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:08.990199 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf"
Apr 16 16:25:09.159937 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:09.159904 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf"]
Apr 16 16:25:09.164002 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:09.163965 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5859820_c3a7_4853_87f4_1a9946dbeaa1.slice/crio-d685678490120fe6f776dc9c1962fa9532d6043c5cf3806df1b39b4ac4d16faf WatchSource:0}: Error finding container d685678490120fe6f776dc9c1962fa9532d6043c5cf3806df1b39b4ac4d16faf: Status 404 returned error can't find the container with id d685678490120fe6f776dc9c1962fa9532d6043c5cf3806df1b39b4ac4d16faf
Apr 16 16:25:10.147891 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.147852 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" event={"ID":"b5859820-c3a7-4853-87f4-1a9946dbeaa1","Type":"ContainerStarted","Data":"d685678490120fe6f776dc9c1962fa9532d6043c5cf3806df1b39b4ac4d16faf"}
Apr 16 16:25:10.682167 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.682134 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-896c984d-8djv5"]
Apr 16 16:25:10.703542 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.703511 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-896c984d-8djv5"]
Apr 16 16:25:10.703681 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.703650 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.707776 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.707754 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 16:25:10.707776 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.707774 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 16:25:10.707973 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.707785 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2dz8r\""
Apr 16 16:25:10.708176 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.708159 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 16:25:10.708272 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.708193 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 16:25:10.708418 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.708402 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 16:25:10.813724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.813687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-config\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.813926 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.813736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-oauth-config\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.813926 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.813769 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-service-ca\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.813926 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.813831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-serving-cert\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.814060 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.813916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqcbm\" (UniqueName: \"kubernetes.io/projected/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-kube-api-access-hqcbm\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.814060 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.813976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-oauth-serving-cert\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.915058 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.915020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-serving-cert\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.915277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.915077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqcbm\" (UniqueName: \"kubernetes.io/projected/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-kube-api-access-hqcbm\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.915277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.915126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-oauth-serving-cert\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.915277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.915150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-config\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.915277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.915175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-oauth-config\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.915277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.915203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-service-ca\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.916037 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.916007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-config\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.916037 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.916025 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-oauth-serving-cert\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.916204 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.916026 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-service-ca\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.917985 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.917950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-oauth-config\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.918087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.917992 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-serving-cert\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:10.924102 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:10.924067 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqcbm\" (UniqueName: \"kubernetes.io/projected/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-kube-api-access-hqcbm\") pod \"console-896c984d-8djv5\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:11.027070 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.027030 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-896c984d-8djv5"
Apr 16 16:25:11.152868 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.152824 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" event={"ID":"b5859820-c3a7-4853-87f4-1a9946dbeaa1","Type":"ContainerStarted","Data":"8f9906457450097d399756ec4dbcea12bdfb9191ac96e8a9551a48ba6ca9cc5f"}
Apr 16 16:25:11.165796 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.165712 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-896c984d-8djv5"]
Apr 16 16:25:11.168976 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:11.168930 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a47cca3_ba8b_4e47_a1a7_6e586fc25037.slice/crio-e1b31a77180cce95098c48a7b59cc38b697dc93e963b26108e05257f6475dd45 WatchSource:0}: Error finding container e1b31a77180cce95098c48a7b59cc38b697dc93e963b26108e05257f6475dd45: Status 404 returned error can't find the container with id e1b31a77180cce95098c48a7b59cc38b697dc93e963b26108e05257f6475dd45
Apr 16 16:25:11.170269 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.170216 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-jzjhf" podStartSLOduration=33.743161819 podStartE2EDuration="35.170200808s" podCreationTimestamp="2026-04-16 16:24:36 +0000 UTC" firstStartedPulling="2026-04-16 16:25:09.166231441 +0000 UTC m=+113.001084328" lastFinishedPulling="2026-04-16 16:25:10.59327042 +0000 UTC m=+114.428123317" observedRunningTime="2026-04-16 16:25:11.169175489 +0000 UTC m=+115.004028397" watchObservedRunningTime="2026-04-16 16:25:11.170200808 +0000 UTC m=+115.005053711"
Apr 16 16:25:11.657353 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.657315 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7kmlg"]
Apr 16 16:25:11.662716 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.662688 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.667540 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.667489 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 16:25:11.667540 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.667496 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 16:25:11.667797 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.667776 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 16:25:11.668270 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.668250 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mp6v2\""
Apr 16 16:25:11.669434 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.669415 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 16:25:11.721593 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721555 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721777 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721622 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtxsb\" (UniqueName: \"kubernetes.io/projected/5177c104-80c5-4f14-b607-d2d272ec4b4a-kube-api-access-wtxsb\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721777 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721651 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721777 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-textfile\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721777 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721726 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-root\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721777 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721750 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-wtmp\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721825 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-sys\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.721969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.721903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5177c104-80c5-4f14-b607-d2d272ec4b4a-metrics-client-ca\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5177c104-80c5-4f14-b607-d2d272ec4b4a-metrics-client-ca\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822698 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822698 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822581 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtxsb\" (UniqueName: \"kubernetes.io/projected/5177c104-80c5-4f14-b607-d2d272ec4b4a-kube-api-access-wtxsb\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822698 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822614 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822698 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822668 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-textfile\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822698 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-root\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822968 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822721 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-wtmp\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822968 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822968 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822794 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-sys\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.822968 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.822882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-sys\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.823316 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.823221 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-wtmp\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.823316 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.823289 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5177c104-80c5-4f14-b607-d2d272ec4b4a-root\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.823491 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.823466 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5177c104-80c5-4f14-b607-d2d272ec4b4a-metrics-client-ca\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.823684 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:25:11.823666 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 16:25:11.823768 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:25:11.823749 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls podName:5177c104-80c5-4f14-b607-d2d272ec4b4a nodeName:}" failed. No retries permitted until 2026-04-16 16:25:12.323730072 +0000 UTC m=+116.158582971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls") pod "node-exporter-7kmlg" (UID: "5177c104-80c5-4f14-b607-d2d272ec4b4a") : secret "node-exporter-tls" not found
Apr 16 16:25:11.824132 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.824053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-textfile\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.824229 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.824103 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-accelerators-collector-config\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.827389 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.827367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:11.833282 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:11.833257 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtxsb\" (UniqueName: \"kubernetes.io/projected/5177c104-80c5-4f14-b607-d2d272ec4b4a-kube-api-access-wtxsb\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:12.157081 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.157041 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-896c984d-8djv5" event={"ID":"0a47cca3-ba8b-4e47-a1a7-6e586fc25037","Type":"ContainerStarted","Data":"e1b31a77180cce95098c48a7b59cc38b697dc93e963b26108e05257f6475dd45"}
Apr 16 16:25:12.327198 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.327149 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg"
Apr 16 16:25:12.327382 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:25:12.327316 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 16:25:12.327447 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:25:12.327404 2571
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls podName:5177c104-80c5-4f14-b607-d2d272ec4b4a nodeName:}" failed. No retries permitted until 2026-04-16 16:25:13.327378941 +0000 UTC m=+117.162231842 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls") pod "node-exporter-7kmlg" (UID: "5177c104-80c5-4f14-b607-d2d272ec4b4a") : secret "node-exporter-tls" not found Apr 16 16:25:12.711225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.711170 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:25:12.715858 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.715830 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.721718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.721244 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 16:25:12.721718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.721353 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-nzr72\"" Apr 16 16:25:12.721718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.721452 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 16:25:12.721718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.721524 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 16:25:12.721718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.721244 2571 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 16:25:12.721718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.721704 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 16:25:12.722433 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.722304 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 16:25:12.722433 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.722329 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 16:25:12.722433 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.722348 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 16:25:12.722947 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.722791 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 16:25:12.739319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.739263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:25:12.831763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831716 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.831944 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831772 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.831944 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831802 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.831944 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831828 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.831944 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9694\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-kube-api-access-q9694\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.831944 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831915 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
16:25:12.832237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-web-config\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.832237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.831974 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-out\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.832237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.832012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.832237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.832048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.832237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.832094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.832237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.832156 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.832237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.832181 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933361 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933326 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933545 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933545 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933406 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933545 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933545 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9694\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-kube-api-access-q9694\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933545 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933525 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-web-config\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
16:25:12.933821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-out\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933620 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933749 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-metric\") pod 
\"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.933821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.933776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.934238 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.934204 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.935955 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.935709 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.936381 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.936139 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.939878 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.939858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-web-config\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.941271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.940823 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.941271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.941177 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-out\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.941271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.941183 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.942005 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.941904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.942866 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.942778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" 
(UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.943646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.943621 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.943736 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.943656 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.944638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.944611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9694\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-kube-api-access-q9694\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:12.945095 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:12.945074 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:13.029898 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.029771 2571 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:25:13.201467 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.200847 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:25:13.338188 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.338084 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg" Apr 16 16:25:13.341055 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.341000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5177c104-80c5-4f14-b607-d2d272ec4b4a-node-exporter-tls\") pod \"node-exporter-7kmlg\" (UID: \"5177c104-80c5-4f14-b607-d2d272ec4b4a\") " pod="openshift-monitoring/node-exporter-7kmlg" Apr 16 16:25:13.474258 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.474221 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7kmlg" Apr 16 16:25:13.640204 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.640100 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-67b7f774bc-wlc24"] Apr 16 16:25:13.645543 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.645515 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" Apr 16 16:25:13.649328 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.648846 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 16:25:13.649328 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.648846 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 16:25:13.649519 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.649420 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 16:25:13.649519 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.649437 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 16:25:13.651373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.650649 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 16:25:13.651373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.650738 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-k44gj\"" Apr 16 16:25:13.651373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.650893 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-3s2j2odjs2b0l\"" Apr 16 16:25:13.656216 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.656193 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67b7f774bc-wlc24"] Apr 16 16:25:13.742201 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742083 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-grpc-tls\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" Apr 16 16:25:13.742201 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742169 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" Apr 16 16:25:13.742201 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9b9c6d2-d1b2-4297-970e-6174d592cccd-metrics-client-ca\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" Apr 16 16:25:13.742501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742310 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" Apr 16 16:25:13.742501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742363 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-tls\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.742501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742423 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.742501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghvm\" (UniqueName: \"kubernetes.io/projected/d9b9c6d2-d1b2-4297-970e-6174d592cccd-kube-api-access-bghvm\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.742501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.742489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843177 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-tls\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843347 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843347 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bghvm\" (UniqueName: \"kubernetes.io/projected/d9b9c6d2-d1b2-4297-970e-6174d592cccd-kube-api-access-bghvm\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843347 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843284 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843347 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843334 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-grpc-tls\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843405 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9b9c6d2-d1b2-4297-970e-6174d592cccd-metrics-client-ca\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.843547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.843446 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.845443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.845007 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9b9c6d2-d1b2-4297-970e-6174d592cccd-metrics-client-ca\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.846521 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.846491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-tls\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.847007 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.846979 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.847543 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.847516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.848816 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.848714 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.848816 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.848771 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-grpc-tls\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.848972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.848904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d9b9c6d2-d1b2-4297-970e-6174d592cccd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.854473 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.854449 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghvm\" (UniqueName: \"kubernetes.io/projected/d9b9c6d2-d1b2-4297-970e-6174d592cccd-kube-api-access-bghvm\") pod \"thanos-querier-67b7f774bc-wlc24\" (UID: \"d9b9c6d2-d1b2-4297-970e-6174d592cccd\") " pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:13.960261 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:13.960163 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24"
Apr 16 16:25:14.378440 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:14.378364 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0cb5df_2680_43a9_99ad_c833dbfe3a7a.slice/crio-50a6df6e4002f46676a8136ad33c367a1e837064f2271c67d8dc97c235c854be WatchSource:0}: Error finding container 50a6df6e4002f46676a8136ad33c367a1e837064f2271c67d8dc97c235c854be: Status 404 returned error can't find the container with id 50a6df6e4002f46676a8136ad33c367a1e837064f2271c67d8dc97c235c854be
Apr 16 16:25:14.390674 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:14.390606 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5177c104_80c5_4f14_b607_d2d272ec4b4a.slice/crio-057e0be3b722cc5017d625d6d1f1efee25a2fdbbaefd37c140ce6831a6cde1d3 WatchSource:0}: Error finding container 057e0be3b722cc5017d625d6d1f1efee25a2fdbbaefd37c140ce6831a6cde1d3: Status 404 returned error can't find the container with id 057e0be3b722cc5017d625d6d1f1efee25a2fdbbaefd37c140ce6831a6cde1d3
Apr 16 16:25:14.540697 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:14.540628 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67b7f774bc-wlc24"]
Apr 16 16:25:14.544423 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:14.544384 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9b9c6d2_d1b2_4297_970e_6174d592cccd.slice/crio-2bd4a46038d9ee5742e23bc03ae68512b405bb854cab4cd34c8ca766b4b79168 WatchSource:0}: Error finding container 2bd4a46038d9ee5742e23bc03ae68512b405bb854cab4cd34c8ca766b4b79168: Status 404 returned error can't find the container with id 2bd4a46038d9ee5742e23bc03ae68512b405bb854cab4cd34c8ca766b4b79168
Apr 16 16:25:15.168426 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:15.168351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7kmlg" event={"ID":"5177c104-80c5-4f14-b607-d2d272ec4b4a","Type":"ContainerStarted","Data":"057e0be3b722cc5017d625d6d1f1efee25a2fdbbaefd37c140ce6831a6cde1d3"}
Apr 16 16:25:15.169710 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:15.169675 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerStarted","Data":"50a6df6e4002f46676a8136ad33c367a1e837064f2271c67d8dc97c235c854be"}
Apr 16 16:25:15.171282 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:15.171239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-896c984d-8djv5" event={"ID":"0a47cca3-ba8b-4e47-a1a7-6e586fc25037","Type":"ContainerStarted","Data":"de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013"}
Apr 16 16:25:15.172835 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:15.172798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" event={"ID":"d9b9c6d2-d1b2-4297-970e-6174d592cccd","Type":"ContainerStarted","Data":"2bd4a46038d9ee5742e23bc03ae68512b405bb854cab4cd34c8ca766b4b79168"}
Apr 16 16:25:15.191035 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:15.190984 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-896c984d-8djv5" podStartSLOduration=1.928239369 podStartE2EDuration="5.190954521s" podCreationTimestamp="2026-04-16 16:25:10 +0000 UTC" firstStartedPulling="2026-04-16 16:25:11.171320209 +0000 UTC m=+115.006173091" lastFinishedPulling="2026-04-16 16:25:14.434035359 +0000 UTC m=+118.268888243" observedRunningTime="2026-04-16 16:25:15.189333817 +0000 UTC m=+119.024186723" watchObservedRunningTime="2026-04-16 16:25:15.190954521 +0000 UTC m=+119.025807426"
Apr 16 16:25:16.192968 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.192934 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"]
Apr 16 16:25:16.198672 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.198646 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.203152 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.203061 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-dksnb\""
Apr 16 16:25:16.203152 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.203089 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 16:25:16.203369 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.203174 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 16:25:16.203601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.203582 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-59s854fhdut5p\""
Apr 16 16:25:16.204454 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.204436 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 16:25:16.204692 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.204658 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 16:25:16.216728 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.216699 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"]
Apr 16 16:25:16.267078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.266976 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpll7\" (UniqueName: \"kubernetes.io/projected/c972e32f-ab16-4627-aca1-2b89acfd43f7-kube-api-access-kpll7\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.267078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.267051 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c972e32f-ab16-4627-aca1-2b89acfd43f7-audit-log\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.267373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.267153 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c972e32f-ab16-4627-aca1-2b89acfd43f7-metrics-server-audit-profiles\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.267373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.267189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-secret-metrics-server-client-certs\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.267373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.267317 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-secret-metrics-server-tls\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.267373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.267345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c972e32f-ab16-4627-aca1-2b89acfd43f7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.267580 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.267377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-client-ca-bundle\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.368471 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.368436 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c972e32f-ab16-4627-aca1-2b89acfd43f7-metrics-server-audit-profiles\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.368471 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.368475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-secret-metrics-server-client-certs\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.368703 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.368563 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-secret-metrics-server-tls\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.368703 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.368587 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c972e32f-ab16-4627-aca1-2b89acfd43f7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.368703 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.368633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-client-ca-bundle\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.368703 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.368671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpll7\" (UniqueName: \"kubernetes.io/projected/c972e32f-ab16-4627-aca1-2b89acfd43f7-kube-api-access-kpll7\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.368904 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.368868 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c972e32f-ab16-4627-aca1-2b89acfd43f7-audit-log\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.369309 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.369281 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c972e32f-ab16-4627-aca1-2b89acfd43f7-audit-log\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.369480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.369429 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c972e32f-ab16-4627-aca1-2b89acfd43f7-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.369665 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.369643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c972e32f-ab16-4627-aca1-2b89acfd43f7-metrics-server-audit-profiles\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.371638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.371612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-secret-metrics-server-client-certs\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.371861 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.371844 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-secret-metrics-server-tls\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.372002 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.371976 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c972e32f-ab16-4627-aca1-2b89acfd43f7-client-ca-bundle\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.391878 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.391851 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpll7\" (UniqueName: \"kubernetes.io/projected/c972e32f-ab16-4627-aca1-2b89acfd43f7-kube-api-access-kpll7\") pod \"metrics-server-575cb7dcdc-lzr8k\" (UID: \"c972e32f-ab16-4627-aca1-2b89acfd43f7\") " pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.405933 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.405901 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"]
Apr 16 16:25:16.411794 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.411766 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"
Apr 16 16:25:16.418077 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.418050 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-dncg9\""
Apr 16 16:25:16.419485 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.419346 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 16:25:16.446898 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.446806 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"]
Apr 16 16:25:16.510554 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.510516 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"
Apr 16 16:25:16.570624 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.570578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/615e1cf5-bc9c-4926-ab18-adea83d0889c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-xt52t\" (UID: \"615e1cf5-bc9c-4926-ab18-adea83d0889c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"
Apr 16 16:25:16.671682 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.671644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/615e1cf5-bc9c-4926-ab18-adea83d0889c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-xt52t\" (UID: \"615e1cf5-bc9c-4926-ab18-adea83d0889c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"
Apr 16 16:25:16.674755 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.674721 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 16:25:16.684859 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.684823 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/615e1cf5-bc9c-4926-ab18-adea83d0889c-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-xt52t\" (UID: \"615e1cf5-bc9c-4926-ab18-adea83d0889c\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"
Apr 16 16:25:16.724388 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.724314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-dncg9\""
Apr 16 16:25:16.733051 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.733022 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"
Apr 16 16:25:16.816558 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.816524 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-56658d986b-vlstz"]
Apr 16 16:25:16.822507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.822480 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.825225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.825175 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 16:25:16.825359 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.825237 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6qfzv\""
Apr 16 16:25:16.825359 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.825235 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 16:25:16.825559 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.825543 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 16:25:16.825643 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.825614 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 16:25:16.825836 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.825819 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 16:25:16.833524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.833493 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 16:25:16.834695 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.834672 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56658d986b-vlstz"]
Apr 16 16:25:16.974974 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.974888 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.974974 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.974933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-federate-client-tls\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.974974 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.974962 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-metrics-client-ca\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.975255 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.975012 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-serving-certs-ca-bundle\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.975255 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.975031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-secret-telemeter-client\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.975255 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.975126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-telemeter-client-tls\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.975255 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.975178 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:16.975255 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:16.975247 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g6kh\" (UniqueName: \"kubernetes.io/projected/d6bc6542-5e72-45d7-90bd-5b1414a1404d-kube-api-access-8g6kh\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.075815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.075778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-metrics-client-ca\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076015 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.075837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-serving-certs-ca-bundle\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076015 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.075867 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-secret-telemeter-client\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076015 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.075930 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-telemeter-client-tls\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076015 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.075966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076015 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.076011 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g6kh\" (UniqueName: \"kubernetes.io/projected/d6bc6542-5e72-45d7-90bd-5b1414a1404d-kube-api-access-8g6kh\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076298 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.076052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076298 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.076099 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-federate-client-tls\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076856 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.076617 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-serving-certs-ca-bundle\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076856 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.076617 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-metrics-client-ca\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.076856 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.076833 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6bc6542-5e72-45d7-90bd-5b1414a1404d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.078966 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.078943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-telemeter-client-tls\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.079217 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.079190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.079310 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.079292 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-secret-telemeter-client\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.079468 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.079445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d6bc6542-5e72-45d7-90bd-5b1414a1404d-federate-client-tls\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.089812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.089781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g6kh\" (UniqueName: \"kubernetes.io/projected/d6bc6542-5e72-45d7-90bd-5b1414a1404d-kube-api-access-8g6kh\") pod \"telemeter-client-56658d986b-vlstz\" (UID: \"d6bc6542-5e72-45d7-90bd-5b1414a1404d\") " pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:17.135324 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:17.135279 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-56658d986b-vlstz"
Apr 16 16:25:19.334406 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.334370 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f5ccf797f-x7vnb"]
Apr 16 16:25:19.340038 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.340010 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.348909 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.348686 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 16:25:19.362679 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.362648 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5ccf797f-x7vnb"] Apr 16 16:25:19.498099 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.498058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-oauth-config\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.498099 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.498107 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwb8w\" (UniqueName: \"kubernetes.io/projected/761ea9d6-b520-497e-bcee-4fb188760ba1-kube-api-access-qwb8w\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.498364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.498179 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-trusted-ca-bundle\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.498364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.498278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-serving-cert\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.498364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.498304 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-oauth-serving-cert\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.498364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.498326 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-console-config\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.498364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.498349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-service-ca\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.599577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.599486 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-oauth-config\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.599577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.599533 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwb8w\" (UniqueName: \"kubernetes.io/projected/761ea9d6-b520-497e-bcee-4fb188760ba1-kube-api-access-qwb8w\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.599577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.599561 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-trusted-ca-bundle\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.599848 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.599608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-serving-cert\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.599848 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.599633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-oauth-serving-cert\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.599848 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.599702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-console-config\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 
16 16:25:19.599848 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.599747 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-service-ca\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.600719 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.600632 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-service-ca\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.601074 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.601044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-oauth-serving-cert\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.601277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.601247 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-console-config\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.601410 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.601345 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-trusted-ca-bundle\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " 
pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.602505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.602487 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-oauth-config\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.602584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.602514 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-serving-cert\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.612301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.612276 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwb8w\" (UniqueName: \"kubernetes.io/projected/761ea9d6-b520-497e-bcee-4fb188760ba1-kube-api-access-qwb8w\") pod \"console-7f5ccf797f-x7vnb\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:19.652200 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:19.652158 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:21.027688 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:21.027650 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-896c984d-8djv5" Apr 16 16:25:21.027688 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:21.027694 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-896c984d-8djv5" Apr 16 16:25:21.032991 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:21.032965 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-896c984d-8djv5" Apr 16 16:25:21.193797 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:21.193769 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-896c984d-8djv5" Apr 16 16:25:22.633324 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:22.633242 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56658d986b-vlstz"] Apr 16 16:25:22.649405 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:22.649371 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-575cb7dcdc-lzr8k"] Apr 16 16:25:22.651855 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:22.651806 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bc6542_5e72_45d7_90bd_5b1414a1404d.slice/crio-654ad74620f645db024ab8d9d1201992b6c21e051f078184814a530aa18907b0 WatchSource:0}: Error finding container 654ad74620f645db024ab8d9d1201992b6c21e051f078184814a530aa18907b0: Status 404 returned error can't find the container with id 654ad74620f645db024ab8d9d1201992b6c21e051f078184814a530aa18907b0 Apr 16 16:25:22.654797 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:22.654771 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc972e32f_ab16_4627_aca1_2b89acfd43f7.slice/crio-c7b54600ff365c89e658037bf7cd5e33646e674fa6ac1c2e7428b05ac83b341a WatchSource:0}: Error finding container c7b54600ff365c89e658037bf7cd5e33646e674fa6ac1c2e7428b05ac83b341a: Status 404 returned error can't find the container with id c7b54600ff365c89e658037bf7cd5e33646e674fa6ac1c2e7428b05ac83b341a Apr 16 16:25:22.896822 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:22.896712 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t"] Apr 16 16:25:22.901265 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:22.901224 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod615e1cf5_bc9c_4926_ab18_adea83d0889c.slice/crio-95ea062dc38d6637144eaa5cfacef1ba7884e4d79f4ed035009ebcdece5cb73a WatchSource:0}: Error finding container 95ea062dc38d6637144eaa5cfacef1ba7884e4d79f4ed035009ebcdece5cb73a: Status 404 returned error can't find the container with id 95ea062dc38d6637144eaa5cfacef1ba7884e4d79f4ed035009ebcdece5cb73a Apr 16 16:25:22.905848 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:22.905404 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5ccf797f-x7vnb"] Apr 16 16:25:22.911533 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:22.911498 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod761ea9d6_b520_497e_bcee_4fb188760ba1.slice/crio-df72a5f70d7605629e58b068d92a902dc61f321eea44c98c1911a402c906d408 WatchSource:0}: Error finding container df72a5f70d7605629e58b068d92a902dc61f321eea44c98c1911a402c906d408: Status 404 returned error can't find the container with id df72a5f70d7605629e58b068d92a902dc61f321eea44c98c1911a402c906d408 Apr 16 16:25:23.199169 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.199091 
2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-q95n6" event={"ID":"3823e5d2-5cbd-4e2a-bf95-dacceea78679","Type":"ContainerStarted","Data":"4efc87dc5df6aabc243a756e06d3711f4921eb1cf7f82cfa590eb7156c9ec209"} Apr 16 16:25:23.199692 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.199657 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-q95n6" Apr 16 16:25:23.202746 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.202717 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5ccf797f-x7vnb" event={"ID":"761ea9d6-b520-497e-bcee-4fb188760ba1","Type":"ContainerStarted","Data":"caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3"} Apr 16 16:25:23.202876 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.202754 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5ccf797f-x7vnb" event={"ID":"761ea9d6-b520-497e-bcee-4fb188760ba1","Type":"ContainerStarted","Data":"df72a5f70d7605629e58b068d92a902dc61f321eea44c98c1911a402c906d408"} Apr 16 16:25:23.204698 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.204500 2571 generic.go:358] "Generic (PLEG): container finished" podID="5177c104-80c5-4f14-b607-d2d272ec4b4a" containerID="004f19d2d410f887571ff5c5b26f4fd5b3410da50f149800c801ceabf7b549ae" exitCode=0 Apr 16 16:25:23.204698 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.204582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7kmlg" event={"ID":"5177c104-80c5-4f14-b607-d2d272ec4b4a","Type":"ContainerDied","Data":"004f19d2d410f887571ff5c5b26f4fd5b3410da50f149800c801ceabf7b549ae"} Apr 16 16:25:23.206297 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.206239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k" 
event={"ID":"c972e32f-ab16-4627-aca1-2b89acfd43f7","Type":"ContainerStarted","Data":"c7b54600ff365c89e658037bf7cd5e33646e674fa6ac1c2e7428b05ac83b341a"} Apr 16 16:25:23.207911 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.207856 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t" event={"ID":"615e1cf5-bc9c-4926-ab18-adea83d0889c","Type":"ContainerStarted","Data":"95ea062dc38d6637144eaa5cfacef1ba7884e4d79f4ed035009ebcdece5cb73a"} Apr 16 16:25:23.210648 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.210624 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-q95n6" Apr 16 16:25:23.210987 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.210887 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerID="7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18" exitCode=0 Apr 16 16:25:23.210987 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.210958 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18"} Apr 16 16:25:23.212949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.212919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56658d986b-vlstz" event={"ID":"d6bc6542-5e72-45d7-90bd-5b1414a1404d","Type":"ContainerStarted","Data":"654ad74620f645db024ab8d9d1201992b6c21e051f078184814a530aa18907b0"} Apr 16 16:25:23.259028 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.256321 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-q95n6" podStartSLOduration=2.052322273 podStartE2EDuration="19.256300265s" podCreationTimestamp="2026-04-16 16:25:04 +0000 UTC" 
firstStartedPulling="2026-04-16 16:25:05.242304575 +0000 UTC m=+109.077157460" lastFinishedPulling="2026-04-16 16:25:22.446282563 +0000 UTC m=+126.281135452" observedRunningTime="2026-04-16 16:25:23.222851105 +0000 UTC m=+127.057704036" watchObservedRunningTime="2026-04-16 16:25:23.256300265 +0000 UTC m=+127.091153172" Apr 16 16:25:23.324740 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:23.323882 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f5ccf797f-x7vnb" podStartSLOduration=4.323860878 podStartE2EDuration="4.323860878s" podCreationTimestamp="2026-04-16 16:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:23.322412977 +0000 UTC m=+127.157265885" watchObservedRunningTime="2026-04-16 16:25:23.323860878 +0000 UTC m=+127.158713784" Apr 16 16:25:24.222623 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:24.222150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7kmlg" event={"ID":"5177c104-80c5-4f14-b607-d2d272ec4b4a","Type":"ContainerStarted","Data":"9f7eb2b29456806b8119c6098feec3fb82f131590a9fb93ecbb28acc874d4316"} Apr 16 16:25:26.480993 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:26.480898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: \"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:25:26.484712 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:26.484650 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/073e645f-92a9-4855-9057-6a125ec9ebda-metrics-certs\") pod \"network-metrics-daemon-mtw25\" (UID: 
\"073e645f-92a9-4855-9057-6a125ec9ebda\") " pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:25:26.563247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:26.563207 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pm5r6\"" Apr 16 16:25:26.570441 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:26.570407 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mtw25" Apr 16 16:25:27.773589 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:27.773328 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mtw25"] Apr 16 16:25:27.776557 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:27.776283 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073e645f_92a9_4855_9057_6a125ec9ebda.slice/crio-3abaa80399f8aac42749ddbed7187566a143c08989ad4874c235d4b08b109576 WatchSource:0}: Error finding container 3abaa80399f8aac42749ddbed7187566a143c08989ad4874c235d4b08b109576: Status 404 returned error can't find the container with id 3abaa80399f8aac42749ddbed7187566a143c08989ad4874c235d4b08b109576 Apr 16 16:25:28.237560 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.237524 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t" event={"ID":"615e1cf5-bc9c-4926-ab18-adea83d0889c","Type":"ContainerStarted","Data":"b641972526602d048db845a3eefd11a8e2c16bc1a79b248c1d016c5955dbee85"} Apr 16 16:25:28.237922 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.237896 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t" Apr 16 16:25:28.240277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.240252 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerStarted","Data":"c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec"} Apr 16 16:25:28.240277 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.240282 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerStarted","Data":"ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd"} Apr 16 16:25:28.240479 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.240296 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerStarted","Data":"fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4"} Apr 16 16:25:28.242336 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.242308 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56658d986b-vlstz" event={"ID":"d6bc6542-5e72-45d7-90bd-5b1414a1404d","Type":"ContainerStarted","Data":"56bcf3465390685cdbd3b1d00ed0c6ab65134664909ca50ede6045744b3d0e25"} Apr 16 16:25:28.242455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.242341 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56658d986b-vlstz" event={"ID":"d6bc6542-5e72-45d7-90bd-5b1414a1404d","Type":"ContainerStarted","Data":"a96b121b0a777b212d750b44f4e37808db91bde33025b31879bfe79c087ab50c"} Apr 16 16:25:28.242455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.242359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56658d986b-vlstz" event={"ID":"d6bc6542-5e72-45d7-90bd-5b1414a1404d","Type":"ContainerStarted","Data":"39d13c2d6bab6b4aca14eed813767af2dede92d12c2377cfd96b77f5e373a1e2"} Apr 16 16:25:28.244400 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:25:28.244372 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t" Apr 16 16:25:28.244964 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.244913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" event={"ID":"d9b9c6d2-d1b2-4297-970e-6174d592cccd","Type":"ContainerStarted","Data":"1f3231b9a3392c2e698de515f6fe934b673b929c280cd3a4e0ee54d6de0f8e20"} Apr 16 16:25:28.244964 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.244953 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" event={"ID":"d9b9c6d2-d1b2-4297-970e-6174d592cccd","Type":"ContainerStarted","Data":"d9ad81ec20fa2ad687d2775e9277d1c898fa4f02a9c61f982ac5532098d759a7"} Apr 16 16:25:28.245105 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.244968 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" event={"ID":"d9b9c6d2-d1b2-4297-970e-6174d592cccd","Type":"ContainerStarted","Data":"ca615d379d36ac615e584ea47f9449da8d04a8330e9b633e73a232cf38581716"} Apr 16 16:25:28.246142 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.246093 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mtw25" event={"ID":"073e645f-92a9-4855-9057-6a125ec9ebda","Type":"ContainerStarted","Data":"3abaa80399f8aac42749ddbed7187566a143c08989ad4874c235d4b08b109576"} Apr 16 16:25:28.248495 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.248474 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7kmlg" event={"ID":"5177c104-80c5-4f14-b607-d2d272ec4b4a","Type":"ContainerStarted","Data":"0b27a971bf2b2ae376d073fe415d002e26d06885fac974af2a4ba5b0c0072331"} Apr 16 16:25:28.249893 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.249848 2571 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k" event={"ID":"c972e32f-ab16-4627-aca1-2b89acfd43f7","Type":"ContainerStarted","Data":"0acf4782b741c8a70af5003990a62dc036cf0e3fa8483f2a2505975f9142fafc"} Apr 16 16:25:28.294881 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.294821 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-xt52t" podStartSLOduration=7.593139304 podStartE2EDuration="12.294799239s" podCreationTimestamp="2026-04-16 16:25:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:22.907430463 +0000 UTC m=+126.742283350" lastFinishedPulling="2026-04-16 16:25:27.609090392 +0000 UTC m=+131.443943285" observedRunningTime="2026-04-16 16:25:28.260893381 +0000 UTC m=+132.095746286" watchObservedRunningTime="2026-04-16 16:25:28.294799239 +0000 UTC m=+132.129652143" Apr 16 16:25:28.337896 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.336438 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7kmlg" podStartSLOduration=9.334728606 podStartE2EDuration="17.336418425s" podCreationTimestamp="2026-04-16 16:25:11 +0000 UTC" firstStartedPulling="2026-04-16 16:25:14.392992126 +0000 UTC m=+118.227845007" lastFinishedPulling="2026-04-16 16:25:22.39468193 +0000 UTC m=+126.229534826" observedRunningTime="2026-04-16 16:25:28.293225739 +0000 UTC m=+132.128078646" watchObservedRunningTime="2026-04-16 16:25:28.336418425 +0000 UTC m=+132.171271332" Apr 16 16:25:28.337896 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.337360 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-56658d986b-vlstz" podStartSLOduration=7.413463632 podStartE2EDuration="12.337349047s" podCreationTimestamp="2026-04-16 16:25:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:22.65394759 +0000 UTC m=+126.488800478" lastFinishedPulling="2026-04-16 16:25:27.577833006 +0000 
UTC m=+131.412685893" observedRunningTime="2026-04-16 16:25:28.336307174 +0000 UTC m=+132.171160115" watchObservedRunningTime="2026-04-16 16:25:28.337349047 +0000 UTC m=+132.172201951" Apr 16 16:25:28.379392 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:28.378926 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k" podStartSLOduration=7.458158352 podStartE2EDuration="12.378904992s" podCreationTimestamp="2026-04-16 16:25:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:22.657470165 +0000 UTC m=+126.492323062" lastFinishedPulling="2026-04-16 16:25:27.578216816 +0000 UTC m=+131.413069702" observedRunningTime="2026-04-16 16:25:28.377276478 +0000 UTC m=+132.212129407" watchObservedRunningTime="2026-04-16 16:25:28.378904992 +0000 UTC m=+132.213757897" Apr 16 16:25:29.258144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:29.258080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerStarted","Data":"5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f"} Apr 16 16:25:29.258144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:29.258150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerStarted","Data":"ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750"} Apr 16 16:25:29.652438 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:29.652394 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:29.652438 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:29.652450 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:29.658141 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:25:29.658080 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:30.264727 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:30.264684 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerStarted","Data":"22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60"} Apr 16 16:25:30.267702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:30.267648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" event={"ID":"d9b9c6d2-d1b2-4297-970e-6174d592cccd","Type":"ContainerStarted","Data":"239a76a2e9e391c23a28e92c01a20feb10a8bdc458cd71b26925e84b741fa1a2"} Apr 16 16:25:30.267702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:30.267681 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" event={"ID":"d9b9c6d2-d1b2-4297-970e-6174d592cccd","Type":"ContainerStarted","Data":"54364c78d334c29cadd0b9895e26b7135a8240f788d46e13d1ce8ddc26d1bb22"} Apr 16 16:25:30.271675 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:30.271648 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mtw25" event={"ID":"073e645f-92a9-4855-9057-6a125ec9ebda","Type":"ContainerStarted","Data":"11efdca721d9b6b963610fb8fa7fe30bef0772c58307eecfbc40ada5af039164"} Apr 16 16:25:30.277078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:30.277052 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:25:30.303375 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:30.303300 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.718006344 podStartE2EDuration="18.303280866s" 
podCreationTimestamp="2026-04-16 16:25:12 +0000 UTC" firstStartedPulling="2026-04-16 16:25:14.381348806 +0000 UTC m=+118.216201688" lastFinishedPulling="2026-04-16 16:25:29.966623318 +0000 UTC m=+133.801476210" observedRunningTime="2026-04-16 16:25:30.29949886 +0000 UTC m=+134.134351787" watchObservedRunningTime="2026-04-16 16:25:30.303280866 +0000 UTC m=+134.138133772" Apr 16 16:25:30.370708 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:30.370620 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-896c984d-8djv5"] Apr 16 16:25:31.181062 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.181025 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-959cd8fc-t4vzx"] Apr 16 16:25:31.201762 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.201724 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.213982 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.213950 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-959cd8fc-t4vzx"] Apr 16 16:25:31.224614 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.224575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-config\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.224763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.224651 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-trusted-ca-bundle\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 
16:25:31.224763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.224682 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-serving-cert\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.224763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.224703 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-oauth-serving-cert\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.224901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.224835 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-oauth-config\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.224901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.224869 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-service-ca\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.224901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.224892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmjgn\" (UniqueName: 
\"kubernetes.io/projected/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-kube-api-access-qmjgn\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.279520 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.279483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" event={"ID":"d9b9c6d2-d1b2-4297-970e-6174d592cccd","Type":"ContainerStarted","Data":"dfaaebadbe17b62c88a7bd792f68bbcba5e9ea18dbd91e69ae7a8e0fe6354a46"} Apr 16 16:25:31.279962 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.279691 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" Apr 16 16:25:31.281734 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.281701 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mtw25" event={"ID":"073e645f-92a9-4855-9057-6a125ec9ebda","Type":"ContainerStarted","Data":"5e877b7687301e4acf0b61ec94caa60f8429598a8480b3291f9d086bc3c15fa3"} Apr 16 16:25:31.321306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.321241 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" podStartSLOduration=3.069202963 podStartE2EDuration="18.321222646s" podCreationTimestamp="2026-04-16 16:25:13 +0000 UTC" firstStartedPulling="2026-04-16 16:25:14.546368693 +0000 UTC m=+118.381221574" lastFinishedPulling="2026-04-16 16:25:29.798388362 +0000 UTC m=+133.633241257" observedRunningTime="2026-04-16 16:25:31.320827327 +0000 UTC m=+135.155680231" watchObservedRunningTime="2026-04-16 16:25:31.321222646 +0000 UTC m=+135.156075554" Apr 16 16:25:31.325938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.325904 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-trusted-ca-bundle\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.326093 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.326011 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-serving-cert\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.326093 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.326057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-oauth-serving-cert\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.326579 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.326550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-oauth-config\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.327958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.326798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-oauth-serving-cert\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.327958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.326813 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-service-ca\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.327958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.327021 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmjgn\" (UniqueName: \"kubernetes.io/projected/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-kube-api-access-qmjgn\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.327958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.327037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-trusted-ca-bundle\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.327958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.327161 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-config\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.327958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.327717 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-service-ca\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.328301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.328241 2571 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-config\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.329266 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.329241 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-serving-cert\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.329365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.329296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-oauth-config\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.338279 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.338257 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmjgn\" (UniqueName: \"kubernetes.io/projected/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-kube-api-access-qmjgn\") pod \"console-959cd8fc-t4vzx\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.361746 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.361681 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mtw25" podStartSLOduration=133.344201816 podStartE2EDuration="2m15.361664446s" podCreationTimestamp="2026-04-16 16:23:16 +0000 UTC" firstStartedPulling="2026-04-16 16:25:27.778935302 +0000 UTC m=+131.613788191" lastFinishedPulling="2026-04-16 16:25:29.79639793 +0000 UTC m=+133.631250821" 
observedRunningTime="2026-04-16 16:25:31.357673373 +0000 UTC m=+135.192526276" watchObservedRunningTime="2026-04-16 16:25:31.361664446 +0000 UTC m=+135.196517351" Apr 16 16:25:31.513567 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.513477 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:31.694993 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:31.694792 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-959cd8fc-t4vzx"] Apr 16 16:25:31.698090 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:25:31.698055 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b3891e5_7744_4d4b_a2bf_d43b4f68982b.slice/crio-17f497901d9e83f5c7bc8d3aa11c361922b81f557c044fcea395e5efd4a10541 WatchSource:0}: Error finding container 17f497901d9e83f5c7bc8d3aa11c361922b81f557c044fcea395e5efd4a10541: Status 404 returned error can't find the container with id 17f497901d9e83f5c7bc8d3aa11c361922b81f557c044fcea395e5efd4a10541 Apr 16 16:25:32.287959 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:32.287914 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-959cd8fc-t4vzx" event={"ID":"5b3891e5-7744-4d4b-a2bf-d43b4f68982b","Type":"ContainerStarted","Data":"e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588"} Apr 16 16:25:32.287959 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:32.287961 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-959cd8fc-t4vzx" event={"ID":"5b3891e5-7744-4d4b-a2bf-d43b4f68982b","Type":"ContainerStarted","Data":"17f497901d9e83f5c7bc8d3aa11c361922b81f557c044fcea395e5efd4a10541"} Apr 16 16:25:32.294839 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:32.294812 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-67b7f774bc-wlc24" 
Apr 16 16:25:32.311907 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:32.311847 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-959cd8fc-t4vzx" podStartSLOduration=1.311825762 podStartE2EDuration="1.311825762s" podCreationTimestamp="2026-04-16 16:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:32.309945895 +0000 UTC m=+136.144798802" watchObservedRunningTime="2026-04-16 16:25:32.311825762 +0000 UTC m=+136.146678669" Apr 16 16:25:36.511193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:36.511149 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k" Apr 16 16:25:36.511193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:36.511198 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k" Apr 16 16:25:41.514127 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:41.514070 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:41.514615 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:41.514152 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:41.519224 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:41.519198 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:42.322251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:42.322222 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:25:42.386724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:42.386692 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-7f5ccf797f-x7vnb"] Apr 16 16:25:55.397897 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.397817 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-896c984d-8djv5" podUID="0a47cca3-ba8b-4e47-a1a7-6e586fc25037" containerName="console" containerID="cri-o://de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013" gracePeriod=15 Apr 16 16:25:55.682612 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.682589 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-896c984d-8djv5_0a47cca3-ba8b-4e47-a1a7-6e586fc25037/console/0.log" Apr 16 16:25:55.682747 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.682652 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-896c984d-8djv5" Apr 16 16:25:55.851380 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851343 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-oauth-config\") pod \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " Apr 16 16:25:55.851380 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851394 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-serving-cert\") pod \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " Apr 16 16:25:55.851607 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851414 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-config\") pod \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\" (UID: 
\"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " Apr 16 16:25:55.851607 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851462 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-oauth-serving-cert\") pod \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " Apr 16 16:25:55.851607 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851493 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-service-ca\") pod \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " Apr 16 16:25:55.851607 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851533 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqcbm\" (UniqueName: \"kubernetes.io/projected/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-kube-api-access-hqcbm\") pod \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\" (UID: \"0a47cca3-ba8b-4e47-a1a7-6e586fc25037\") " Apr 16 16:25:55.851959 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851926 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-config" (OuterVolumeSpecName: "console-config") pod "0a47cca3-ba8b-4e47-a1a7-6e586fc25037" (UID: "0a47cca3-ba8b-4e47-a1a7-6e586fc25037"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:55.852075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851958 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0a47cca3-ba8b-4e47-a1a7-6e586fc25037" (UID: "0a47cca3-ba8b-4e47-a1a7-6e586fc25037"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:55.852075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.851930 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-service-ca" (OuterVolumeSpecName: "service-ca") pod "0a47cca3-ba8b-4e47-a1a7-6e586fc25037" (UID: "0a47cca3-ba8b-4e47-a1a7-6e586fc25037"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:25:55.853793 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.853760 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0a47cca3-ba8b-4e47-a1a7-6e586fc25037" (UID: "0a47cca3-ba8b-4e47-a1a7-6e586fc25037"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:55.853793 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.853777 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0a47cca3-ba8b-4e47-a1a7-6e586fc25037" (UID: "0a47cca3-ba8b-4e47-a1a7-6e586fc25037"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:55.854386 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.854357 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-kube-api-access-hqcbm" (OuterVolumeSpecName: "kube-api-access-hqcbm") pod "0a47cca3-ba8b-4e47-a1a7-6e586fc25037" (UID: "0a47cca3-ba8b-4e47-a1a7-6e586fc25037"). InnerVolumeSpecName "kube-api-access-hqcbm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:55.953003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.952917 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-oauth-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:25:55.953003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.952948 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:25:55.953003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.952959 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-console-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:25:55.953003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.952969 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-oauth-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:25:55.953003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.952979 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-service-ca\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:25:55.953003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:55.952989 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqcbm\" (UniqueName: \"kubernetes.io/projected/0a47cca3-ba8b-4e47-a1a7-6e586fc25037-kube-api-access-hqcbm\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:25:56.362271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.362186 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-896c984d-8djv5_0a47cca3-ba8b-4e47-a1a7-6e586fc25037/console/0.log" Apr 16 16:25:56.362271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.362230 2571 generic.go:358] "Generic (PLEG): container finished" podID="0a47cca3-ba8b-4e47-a1a7-6e586fc25037" containerID="de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013" exitCode=2 Apr 16 16:25:56.362492 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.362292 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-896c984d-8djv5" Apr 16 16:25:56.362492 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.362308 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-896c984d-8djv5" event={"ID":"0a47cca3-ba8b-4e47-a1a7-6e586fc25037","Type":"ContainerDied","Data":"de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013"} Apr 16 16:25:56.362492 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.362342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-896c984d-8djv5" event={"ID":"0a47cca3-ba8b-4e47-a1a7-6e586fc25037","Type":"ContainerDied","Data":"e1b31a77180cce95098c48a7b59cc38b697dc93e963b26108e05257f6475dd45"} Apr 16 16:25:56.362492 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.362361 2571 scope.go:117] "RemoveContainer" containerID="de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013" Apr 16 16:25:56.370816 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.370794 2571 scope.go:117] "RemoveContainer" containerID="de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013" Apr 16 16:25:56.371086 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:25:56.371064 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013\": container with ID starting with de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013 not found: ID does not exist" containerID="de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013" Apr 16 16:25:56.371156 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.371095 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013"} err="failed to get container status \"de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013\": rpc error: code = 
NotFound desc = could not find container \"de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013\": container with ID starting with de340c05c4313004d17c1238fe48d112ae5ad6eee62d02f581c70b0dc7a9c013 not found: ID does not exist" Apr 16 16:25:56.388170 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.388136 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-896c984d-8djv5"] Apr 16 16:25:56.397262 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.397236 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-896c984d-8djv5"] Apr 16 16:25:56.516383 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.516353 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k" Apr 16 16:25:56.520275 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.520255 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-575cb7dcdc-lzr8k" Apr 16 16:25:56.755568 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:25:56.755532 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a47cca3-ba8b-4e47-a1a7-6e586fc25037" path="/var/lib/kubelet/pods/0a47cca3-ba8b-4e47-a1a7-6e586fc25037/volumes" Apr 16 16:26:07.408393 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.408288 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f5ccf797f-x7vnb" podUID="761ea9d6-b520-497e-bcee-4fb188760ba1" containerName="console" containerID="cri-o://caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3" gracePeriod=15 Apr 16 16:26:07.675631 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.675604 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5ccf797f-x7vnb_761ea9d6-b520-497e-bcee-4fb188760ba1/console/0.log" Apr 16 16:26:07.675780 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:26:07.675695 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:26:07.859805 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.859770 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-serving-cert\") pod \"761ea9d6-b520-497e-bcee-4fb188760ba1\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " Apr 16 16:26:07.859986 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.859818 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-oauth-config\") pod \"761ea9d6-b520-497e-bcee-4fb188760ba1\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " Apr 16 16:26:07.859986 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.859840 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwb8w\" (UniqueName: \"kubernetes.io/projected/761ea9d6-b520-497e-bcee-4fb188760ba1-kube-api-access-qwb8w\") pod \"761ea9d6-b520-497e-bcee-4fb188760ba1\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " Apr 16 16:26:07.859986 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.859857 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-oauth-serving-cert\") pod \"761ea9d6-b520-497e-bcee-4fb188760ba1\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " Apr 16 16:26:07.859986 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.859890 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-trusted-ca-bundle\") pod 
\"761ea9d6-b520-497e-bcee-4fb188760ba1\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " Apr 16 16:26:07.859986 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.859940 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-console-config\") pod \"761ea9d6-b520-497e-bcee-4fb188760ba1\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " Apr 16 16:26:07.859986 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.859967 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-service-ca\") pod \"761ea9d6-b520-497e-bcee-4fb188760ba1\" (UID: \"761ea9d6-b520-497e-bcee-4fb188760ba1\") " Apr 16 16:26:07.860404 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.860376 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "761ea9d6-b520-497e-bcee-4fb188760ba1" (UID: "761ea9d6-b520-497e-bcee-4fb188760ba1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:07.860485 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.860454 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-console-config" (OuterVolumeSpecName: "console-config") pod "761ea9d6-b520-497e-bcee-4fb188760ba1" (UID: "761ea9d6-b520-497e-bcee-4fb188760ba1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:07.860548 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.860489 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-service-ca" (OuterVolumeSpecName: "service-ca") pod "761ea9d6-b520-497e-bcee-4fb188760ba1" (UID: "761ea9d6-b520-497e-bcee-4fb188760ba1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:07.860671 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.860640 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "761ea9d6-b520-497e-bcee-4fb188760ba1" (UID: "761ea9d6-b520-497e-bcee-4fb188760ba1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:07.862209 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.862181 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761ea9d6-b520-497e-bcee-4fb188760ba1-kube-api-access-qwb8w" (OuterVolumeSpecName: "kube-api-access-qwb8w") pod "761ea9d6-b520-497e-bcee-4fb188760ba1" (UID: "761ea9d6-b520-497e-bcee-4fb188760ba1"). InnerVolumeSpecName "kube-api-access-qwb8w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:26:07.862209 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.862192 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "761ea9d6-b520-497e-bcee-4fb188760ba1" (UID: "761ea9d6-b520-497e-bcee-4fb188760ba1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:07.862337 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.862211 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "761ea9d6-b520-497e-bcee-4fb188760ba1" (UID: "761ea9d6-b520-497e-bcee-4fb188760ba1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:07.960812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.960718 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:07.960812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.960756 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761ea9d6-b520-497e-bcee-4fb188760ba1-console-oauth-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:07.960812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.960769 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwb8w\" (UniqueName: \"kubernetes.io/projected/761ea9d6-b520-497e-bcee-4fb188760ba1-kube-api-access-qwb8w\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:07.960812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.960782 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-oauth-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:07.960812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.960794 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-trusted-ca-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:07.960812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.960807 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-console-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:07.961106 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:07.960820 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761ea9d6-b520-497e-bcee-4fb188760ba1-service-ca\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:08.406415 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.406390 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5ccf797f-x7vnb_761ea9d6-b520-497e-bcee-4fb188760ba1/console/0.log" Apr 16 16:26:08.406596 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.406431 2571 generic.go:358] "Generic (PLEG): container finished" podID="761ea9d6-b520-497e-bcee-4fb188760ba1" containerID="caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3" exitCode=2 Apr 16 16:26:08.406596 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.406484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5ccf797f-x7vnb" event={"ID":"761ea9d6-b520-497e-bcee-4fb188760ba1","Type":"ContainerDied","Data":"caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3"} Apr 16 16:26:08.406596 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.406506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5ccf797f-x7vnb" event={"ID":"761ea9d6-b520-497e-bcee-4fb188760ba1","Type":"ContainerDied","Data":"df72a5f70d7605629e58b068d92a902dc61f321eea44c98c1911a402c906d408"} Apr 16 16:26:08.406596 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:26:08.406513 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5ccf797f-x7vnb" Apr 16 16:26:08.406785 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.406520 2571 scope.go:117] "RemoveContainer" containerID="caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3" Apr 16 16:26:08.415200 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.415017 2571 scope.go:117] "RemoveContainer" containerID="caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3" Apr 16 16:26:08.415459 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:08.415357 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3\": container with ID starting with caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3 not found: ID does not exist" containerID="caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3" Apr 16 16:26:08.415459 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.415382 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3"} err="failed to get container status \"caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3\": rpc error: code = NotFound desc = could not find container \"caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3\": container with ID starting with caf98e0dcf945f857ad87e7bbf838dcdc593d32d90b4ca748eb77b0575fa0ce3 not found: ID does not exist" Apr 16 16:26:08.428705 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.428674 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f5ccf797f-x7vnb"] Apr 16 16:26:08.432296 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.432269 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-7f5ccf797f-x7vnb"] Apr 16 16:26:08.754787 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:08.754708 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761ea9d6-b520-497e-bcee-4fb188760ba1" path="/var/lib/kubelet/pods/761ea9d6-b520-497e-bcee-4fb188760ba1/volumes" Apr 16 16:26:10.414298 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:10.414261 2571 generic.go:358] "Generic (PLEG): container finished" podID="aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a" containerID="252df2a085dff4c20ff9d9518ce81c3075e002aa0924cefe4d18a0f5ae94eccc" exitCode=0 Apr 16 16:26:10.414675 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:10.414336 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" event={"ID":"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a","Type":"ContainerDied","Data":"252df2a085dff4c20ff9d9518ce81c3075e002aa0924cefe4d18a0f5ae94eccc"} Apr 16 16:26:10.414717 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:10.414686 2571 scope.go:117] "RemoveContainer" containerID="252df2a085dff4c20ff9d9518ce81c3075e002aa0924cefe4d18a0f5ae94eccc" Apr 16 16:26:11.418530 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:11.418493 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-hxp97" event={"ID":"aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a","Type":"ContainerStarted","Data":"ea880ae6edcd795d12801188983982e51bd9db9704859795c12a7c5101b9377d"} Apr 16 16:26:32.107327 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.107289 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:26:32.107802 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.107745 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="alertmanager" 
containerID="cri-o://fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4" gracePeriod=120 Apr 16 16:26:32.107946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.107818 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-metric" containerID="cri-o://5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f" gracePeriod=120 Apr 16 16:26:32.107946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.107840 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="prom-label-proxy" containerID="cri-o://22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60" gracePeriod=120 Apr 16 16:26:32.107946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.107879 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-web" containerID="cri-o://c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec" gracePeriod=120 Apr 16 16:26:32.108105 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.107909 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="config-reloader" containerID="cri-o://ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd" gracePeriod=120 Apr 16 16:26:32.108105 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.108036 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy" 
containerID="cri-o://ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750" gracePeriod=120 Apr 16 16:26:32.490379 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490336 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerID="22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60" exitCode=0 Apr 16 16:26:32.490379 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490366 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerID="5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f" exitCode=0 Apr 16 16:26:32.490379 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490376 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerID="ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750" exitCode=0 Apr 16 16:26:32.490379 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490384 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerID="ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd" exitCode=0 Apr 16 16:26:32.490379 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490391 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerID="fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4" exitCode=0 Apr 16 16:26:32.490671 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490413 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60"} Apr 16 16:26:32.490671 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490451 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f"} Apr 16 16:26:32.490671 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750"} Apr 16 16:26:32.490671 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490473 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd"} Apr 16 16:26:32.490671 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:32.490482 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4"} Apr 16 16:26:33.354742 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.354716 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:26:33.480425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480324 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-main-tls\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480389 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480421 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-tls-assets\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480457 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-cluster-tls-config\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480486 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-trusted-ca-bundle\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: 
\"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480529 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-main-db\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480556 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-volume\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480590 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9694\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-kube-api-access-q9694\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480620 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-metrics-client-ca\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480671 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-out\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.480702 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:26:33.480698 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-web-config\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.481065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480732 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.481065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480768 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy\") pod \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\" (UID: \"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a\") " Apr 16 16:26:33.481065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.480884 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:26:33.481065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.481026 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:33.481065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.481047 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-main-db\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:26:33.481355 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.481087 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:26:33.483613 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.483579 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:33.483967 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.483911 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:26:33.484076 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.484005 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-out" (OuterVolumeSpecName: "config-out") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:26:33.484360 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.484247 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-volume" (OuterVolumeSpecName: "config-volume") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:33.484360 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.484331 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:33.484500 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.484452 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-kube-api-access-q9694" (OuterVolumeSpecName: "kube-api-access-q9694") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "kube-api-access-q9694". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:26:33.484832 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.484811 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:33.485736 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.485716 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:33.487926 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.487902 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:33.495473 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.495440 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-web-config" (OuterVolumeSpecName: "web-config") pod "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" (UID: "ce0cb5df-2680-43a9-99ad-c833dbfe3a7a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:26:33.496940 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.496914 2571 generic.go:358] "Generic (PLEG): container finished" podID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerID="c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec" exitCode=0 Apr 16 16:26:33.497036 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.496991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec"} Apr 16 16:26:33.497036 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.497027 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.497165 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.497043 2571 scope.go:117] "RemoveContainer" containerID="22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60"
Apr 16 16:26:33.497165 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.497031 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ce0cb5df-2680-43a9-99ad-c833dbfe3a7a","Type":"ContainerDied","Data":"50a6df6e4002f46676a8136ad33c367a1e837064f2271c67d8dc97c235c854be"}
Apr 16 16:26:33.505714 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.505695 2571 scope.go:117] "RemoveContainer" containerID="5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f"
Apr 16 16:26:33.512253 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.512236 2571 scope.go:117] "RemoveContainer" containerID="ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750"
Apr 16 16:26:33.518695 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.518664 2571 scope.go:117] "RemoveContainer" containerID="c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec"
Apr 16 16:26:33.525261 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.525243 2571 scope.go:117] "RemoveContainer" containerID="ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd"
Apr 16 16:26:33.529568 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.529522 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:26:33.531954 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.531931 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:26:33.533022 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.533007 2571 scope.go:117] "RemoveContainer" containerID="fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4"
Apr 16 16:26:33.539517 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.539495 2571 scope.go:117] "RemoveContainer" containerID="7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18"
Apr 16 16:26:33.545928 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.545908 2571 scope.go:117] "RemoveContainer" containerID="22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60"
Apr 16 16:26:33.546207 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:33.546182 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60\": container with ID starting with 22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60 not found: ID does not exist" containerID="22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60"
Apr 16 16:26:33.546272 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.546219 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60"} err="failed to get container status \"22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60\": rpc error: code = NotFound desc = could not find container \"22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60\": container with ID starting with 22a104b391945bfccca75131e3c83d67072c8a48f0ba42ec81e7e4c97552da60 not found: ID does not exist"
Apr 16 16:26:33.546272 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.546238 2571 scope.go:117] "RemoveContainer" containerID="5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f"
Apr 16 16:26:33.546511 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:33.546492 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f\": container with ID starting with 5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f not found: ID does not exist" containerID="5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f"
Apr 16 16:26:33.546555 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.546518 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f"} err="failed to get container status \"5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f\": rpc error: code = NotFound desc = could not find container \"5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f\": container with ID starting with 5e5c8cd1aa3e5eb605edf5e4e749d15624fafa483c11ae7aebecd9eaf82c1a8f not found: ID does not exist"
Apr 16 16:26:33.546555 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.546534 2571 scope.go:117] "RemoveContainer" containerID="ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750"
Apr 16 16:26:33.546775 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:33.546756 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750\": container with ID starting with ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750 not found: ID does not exist" containerID="ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750"
Apr 16 16:26:33.546837 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.546797 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750"} err="failed to get container status \"ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750\": rpc error: code = NotFound desc = could not find container \"ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750\": container with ID starting with ac8fc4145cb1a6dc149fdc9b0859fbabd70ae1a258c0ddeb7bf9ace78a12e750 not found: ID does not exist"
Apr 16 16:26:33.546837 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.546821 2571 scope.go:117] "RemoveContainer" containerID="c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec"
Apr 16 16:26:33.547053 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:33.547036 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec\": container with ID starting with c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec not found: ID does not exist" containerID="c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec"
Apr 16 16:26:33.547091 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.547059 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec"} err="failed to get container status \"c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec\": rpc error: code = NotFound desc = could not find container \"c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec\": container with ID starting with c1f9f25f234cc5a2c3ece11d282d02a9e8b91567db67cf103e095d8203e23dec not found: ID does not exist"
Apr 16 16:26:33.547091 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.547076 2571 scope.go:117] "RemoveContainer" containerID="ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd"
Apr 16 16:26:33.547338 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:33.547321 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd\": container with ID starting with ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd not found: ID does not exist" containerID="ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd"
Apr 16 16:26:33.547386 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.547342 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd"} err="failed to get container status \"ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd\": rpc error: code = NotFound desc = could not find container \"ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd\": container with ID starting with ab415ec96a4257bb5ea333a1f54ef2bd65ac11550d4454590fe9f0d5fe4011bd not found: ID does not exist"
Apr 16 16:26:33.547386 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.547359 2571 scope.go:117] "RemoveContainer" containerID="fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4"
Apr 16 16:26:33.547604 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:33.547585 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4\": container with ID starting with fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4 not found: ID does not exist" containerID="fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4"
Apr 16 16:26:33.547646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.547608 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4"} err="failed to get container status \"fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4\": rpc error: code = NotFound desc = could not find container \"fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4\": container with ID starting with fc5b37b53d61ddc8acbdb182f321cdb63f66653fcffb4ed01d25f065797d62d4 not found: ID does not exist"
Apr 16 16:26:33.547646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.547621 2571 scope.go:117] "RemoveContainer" containerID="7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18"
Apr 16 16:26:33.547808 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:26:33.547794 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18\": container with ID starting with 7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18 not found: ID does not exist" containerID="7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18"
Apr 16 16:26:33.547853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.547809 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18"} err="failed to get container status \"7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18\": rpc error: code = NotFound desc = could not find container \"7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18\": container with ID starting with 7de67e3be70c3961a0853d534c17d8a3bfb50f84deecfefb6c8259552d4b5f18 not found: ID does not exist"
Apr 16 16:26:33.571705 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.571671 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:26:33.571998 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.571985 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a47cca3-ba8b-4e47-a1a7-6e586fc25037" containerName="console"
Apr 16 16:26:33.572040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572000 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a47cca3-ba8b-4e47-a1a7-6e586fc25037" containerName="console"
Apr 16 16:26:33.572040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572010 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy"
Apr 16 16:26:33.572040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572017 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy"
Apr 16 16:26:33.572040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572024 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="init-config-reloader"
Apr 16 16:26:33.572040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572030 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="init-config-reloader"
Apr 16 16:26:33.572040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572040 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-web"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572049 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-web"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572063 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="prom-label-proxy"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572068 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="prom-label-proxy"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572078 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="config-reloader"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572083 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="config-reloader"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572092 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="alertmanager"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572097 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="alertmanager"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572104 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="761ea9d6-b520-497e-bcee-4fb188760ba1" containerName="console"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572109 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="761ea9d6-b520-497e-bcee-4fb188760ba1" containerName="console"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572132 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-metric"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572138 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-metric"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572191 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572203 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-web"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572214 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="kube-rbac-proxy-metric"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572221 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a47cca3-ba8b-4e47-a1a7-6e586fc25037" containerName="console"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572228 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="761ea9d6-b520-497e-bcee-4fb188760ba1" containerName="console"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572234 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="prom-label-proxy"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572241 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="config-reloader"
Apr 16 16:26:33.572247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.572248 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" containerName="alertmanager"
Apr 16 16:26:33.577393 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.577372 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.580075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580055 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 16:26:33.580331 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580309 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-nzr72\""
Apr 16 16:26:33.580423 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580321 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 16:26:33.580423 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580366 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 16:26:33.580423 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580321 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 16:26:33.580571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580353 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 16:26:33.580571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580374 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 16:26:33.580571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580504 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 16:26:33.580690 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.580677 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 16:26:33.581656 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581639 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-cluster-tls-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581660 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581675 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-volume\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581689 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9694\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-kube-api-access-q9694\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581698 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-metrics-client-ca\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581706 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-config-out\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581716 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-web-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581724 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581734 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581743 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-main-tls\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581752 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.581919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.581762 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a-tls-assets\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:26:33.587396 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.587376 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 16:26:33.593374 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.593351 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:26:33.683086 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683086 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683088 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683169 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ea169814-38bf-4113-95a2-44b8fef7b9b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683189 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea169814-38bf-4113-95a2-44b8fef7b9b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea169814-38bf-4113-95a2-44b8fef7b9b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683229 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea169814-38bf-4113-95a2-44b8fef7b9b0-config-out\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-web-config\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683344 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lth7\" (UniqueName: \"kubernetes.io/projected/ea169814-38bf-4113-95a2-44b8fef7b9b0-kube-api-access-8lth7\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea169814-38bf-4113-95a2-44b8fef7b9b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683475 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.683505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.683493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784612 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784612 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ea169814-38bf-4113-95a2-44b8fef7b9b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784612 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea169814-38bf-4113-95a2-44b8fef7b9b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784612 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea169814-38bf-4113-95a2-44b8fef7b9b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784641 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea169814-38bf-4113-95a2-44b8fef7b9b0-config-out\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-web-config\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784704 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lth7\" (UniqueName: \"kubernetes.io/projected/ea169814-38bf-4113-95a2-44b8fef7b9b0-kube-api-access-8lth7\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea169814-38bf-4113-95a2-44b8fef7b9b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784781 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.784938 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.785373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.784944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.785373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.785060 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ea169814-38bf-4113-95a2-44b8fef7b9b0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.785723 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.785682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea169814-38bf-4113-95a2-44b8fef7b9b0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.786638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.786307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea169814-38bf-4113-95a2-44b8fef7b9b0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.787721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.787696 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ea169814-38bf-4113-95a2-44b8fef7b9b0-config-out\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.787811 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.787790 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.787953 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.787926 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.788053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.788017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-config-volume\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.788143 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.788085 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ea169814-38bf-4113-95a2-44b8fef7b9b0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.788220 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.788196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-web-config\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.788322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.788302 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.788782 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.788763 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.789811 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.789793 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ea169814-38bf-4113-95a2-44b8fef7b9b0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.794019 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.794000 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lth7\" (UniqueName: \"kubernetes.io/projected/ea169814-38bf-4113-95a2-44b8fef7b9b0-kube-api-access-8lth7\") pod \"alertmanager-main-0\" (UID: \"ea169814-38bf-4113-95a2-44b8fef7b9b0\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:26:33.887873 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:33.887832 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:26:34.025512 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:34.025486 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:26:34.028468 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:26:34.028435 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea169814_38bf_4113_95a2_44b8fef7b9b0.slice/crio-978ed546d31f1b6d534a5bebabe9cdd3128f00476310fd2f7005021dda6e5a72 WatchSource:0}: Error finding container 978ed546d31f1b6d534a5bebabe9cdd3128f00476310fd2f7005021dda6e5a72: Status 404 returned error can't find the container with id 978ed546d31f1b6d534a5bebabe9cdd3128f00476310fd2f7005021dda6e5a72 Apr 16 16:26:34.501729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:34.501688 2571 generic.go:358] "Generic (PLEG): container finished" podID="ea169814-38bf-4113-95a2-44b8fef7b9b0" containerID="a16b1c981a722787e2f410ef8f15111856061a4fb90030bbeb9f51016f6fe48a" exitCode=0 Apr 16 16:26:34.502173 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:34.501775 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerDied","Data":"a16b1c981a722787e2f410ef8f15111856061a4fb90030bbeb9f51016f6fe48a"} Apr 16 16:26:34.502173 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:34.501809 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerStarted","Data":"978ed546d31f1b6d534a5bebabe9cdd3128f00476310fd2f7005021dda6e5a72"} Apr 16 16:26:34.755545 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:34.755517 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0cb5df-2680-43a9-99ad-c833dbfe3a7a" 
path="/var/lib/kubelet/pods/ce0cb5df-2680-43a9-99ad-c833dbfe3a7a/volumes" Apr 16 16:26:35.508761 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:35.508724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerStarted","Data":"974bb7cfeb02479a62f63cd8f6fe8d854b08c5fdf60bde123f9583384abd5b0b"} Apr 16 16:26:35.508761 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:35.508765 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerStarted","Data":"3c9270fe65ba1b5eec74b8d9290fcd9f1351da5596df00a59ff5ea902c567f12"} Apr 16 16:26:35.509281 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:35.508778 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerStarted","Data":"061bfa8fdb750ed59c7ff0afbab3dc589d00475107855c5188d7c53ec78afda6"} Apr 16 16:26:35.509281 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:35.508790 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerStarted","Data":"34634cda22d3d2260f32a54a5230fa7cf39bea968a9b47000ef9d0a48599e510"} Apr 16 16:26:35.509281 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:35.508801 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerStarted","Data":"6f4bb87f0dfa686b0d012d0582c47f7b9a517127ae4274dfc8930cdf20dd0abe"} Apr 16 16:26:35.509281 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:35.508812 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ea169814-38bf-4113-95a2-44b8fef7b9b0","Type":"ContainerStarted","Data":"4de1508546eb308c53d2708d1a348a4f22fe9a6574d95ff433e55a828781fd66"} Apr 16 16:26:35.535486 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:35.535428 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.535410284 podStartE2EDuration="2.535410284s" podCreationTimestamp="2026-04-16 16:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:26:35.53308036 +0000 UTC m=+199.367933263" watchObservedRunningTime="2026-04-16 16:26:35.535410284 +0000 UTC m=+199.370263187" Apr 16 16:26:45.660352 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.660317 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cd89987cd-p4kp6"] Apr 16 16:26:45.664748 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.664721 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.674343 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.674161 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cd89987cd-p4kp6"] Apr 16 16:26:45.790963 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.790922 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-console-config\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.791109 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.790971 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xq7\" (UniqueName: \"kubernetes.io/projected/02016f6b-3a13-4224-98a2-f77e0c52261f-kube-api-access-h9xq7\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.791109 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.791017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-serving-cert\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.791109 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.791033 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-oauth-serving-cert\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 
16:26:45.791109 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.791058 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-service-ca\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.791109 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.791111 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-trusted-ca-bundle\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.791329 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.791162 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-oauth-config\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.892412 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.892369 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-console-config\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.892597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.892463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xq7\" (UniqueName: \"kubernetes.io/projected/02016f6b-3a13-4224-98a2-f77e0c52261f-kube-api-access-h9xq7\") pod 
\"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.892597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.892503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-serving-cert\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.892597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.892532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-oauth-serving-cert\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.892597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.892570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-service-ca\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.892800 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.892619 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-trusted-ca-bundle\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.892800 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.892653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-oauth-config\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.893278 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.893248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-console-config\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.893399 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.893281 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-service-ca\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.893399 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.893382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-oauth-serving-cert\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.893575 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.893557 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-trusted-ca-bundle\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.894928 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.894911 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-oauth-config\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.895075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.895057 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-serving-cert\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.901043 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.901006 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xq7\" (UniqueName: \"kubernetes.io/projected/02016f6b-3a13-4224-98a2-f77e0c52261f-kube-api-access-h9xq7\") pod \"console-cd89987cd-p4kp6\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:45.977889 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:45.977799 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:46.125383 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:46.125346 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cd89987cd-p4kp6"] Apr 16 16:26:46.129024 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:26:46.128995 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02016f6b_3a13_4224_98a2_f77e0c52261f.slice/crio-36643431bb9fca1c7a9cb7cf612a557555870894f72e7f7e62b50994cb744d76 WatchSource:0}: Error finding container 36643431bb9fca1c7a9cb7cf612a557555870894f72e7f7e62b50994cb744d76: Status 404 returned error can't find the container with id 36643431bb9fca1c7a9cb7cf612a557555870894f72e7f7e62b50994cb744d76 Apr 16 16:26:46.543463 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:46.543428 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd89987cd-p4kp6" event={"ID":"02016f6b-3a13-4224-98a2-f77e0c52261f","Type":"ContainerStarted","Data":"77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e"} Apr 16 16:26:46.543463 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:46.543466 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd89987cd-p4kp6" event={"ID":"02016f6b-3a13-4224-98a2-f77e0c52261f","Type":"ContainerStarted","Data":"36643431bb9fca1c7a9cb7cf612a557555870894f72e7f7e62b50994cb744d76"} Apr 16 16:26:46.564008 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:46.563951 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cd89987cd-p4kp6" podStartSLOduration=1.563933998 podStartE2EDuration="1.563933998s" podCreationTimestamp="2026-04-16 16:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:26:46.562269192 +0000 UTC m=+210.397122096" 
watchObservedRunningTime="2026-04-16 16:26:46.563933998 +0000 UTC m=+210.398786901" Apr 16 16:26:55.978473 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:55.978429 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:55.978473 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:55.978479 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:55.983432 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:55.983405 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:56.579921 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:56.579894 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:26:56.638291 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:26:56.638254 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-959cd8fc-t4vzx"] Apr 16 16:27:21.324350 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.324308 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-f7w8r"] Apr 16 16:27:21.329171 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.329146 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.331442 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.331422 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:27:21.334478 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.334453 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f7w8r"] Apr 16 16:27:21.396668 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.396626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c466876b-30ba-4711-affc-092c2f5418b3-dbus\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.396884 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.396680 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c466876b-30ba-4711-affc-092c2f5418b3-original-pull-secret\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.396884 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.396805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c466876b-30ba-4711-affc-092c2f5418b3-kubelet-config\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.497603 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.497566 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/c466876b-30ba-4711-affc-092c2f5418b3-kubelet-config\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.497765 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.497629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c466876b-30ba-4711-affc-092c2f5418b3-dbus\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.497765 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.497656 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c466876b-30ba-4711-affc-092c2f5418b3-original-pull-secret\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.497765 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.497701 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c466876b-30ba-4711-affc-092c2f5418b3-kubelet-config\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.497882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.497814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c466876b-30ba-4711-affc-092c2f5418b3-dbus\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.499894 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.499878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c466876b-30ba-4711-affc-092c2f5418b3-original-pull-secret\") pod \"global-pull-secret-syncer-f7w8r\" (UID: \"c466876b-30ba-4711-affc-092c2f5418b3\") " pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.639835 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.639745 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-f7w8r" Apr 16 16:27:21.660312 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.660275 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-959cd8fc-t4vzx" podUID="5b3891e5-7744-4d4b-a2bf-d43b4f68982b" containerName="console" containerID="cri-o://e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588" gracePeriod=15 Apr 16 16:27:21.765571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.765536 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-f7w8r"] Apr 16 16:27:21.768389 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:27:21.768352 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc466876b_30ba_4711_affc_092c2f5418b3.slice/crio-faa025fd2b585a2c8227146a05181f178542f9686a5605a67a543a1d21bd6dc8 WatchSource:0}: Error finding container faa025fd2b585a2c8227146a05181f178542f9686a5605a67a543a1d21bd6dc8: Status 404 returned error can't find the container with id faa025fd2b585a2c8227146a05181f178542f9686a5605a67a543a1d21bd6dc8 Apr 16 16:27:21.883588 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.883559 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-959cd8fc-t4vzx_5b3891e5-7744-4d4b-a2bf-d43b4f68982b/console/0.log" Apr 16 16:27:21.883742 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:21.883624 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:27:22.002852 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.002810 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-serving-cert\") pod \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " Apr 16 16:27:22.002852 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.002847 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-config\") pod \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " Apr 16 16:27:22.003087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.002875 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-oauth-serving-cert\") pod \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " Apr 16 16:27:22.003087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.002909 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-trusted-ca-bundle\") pod \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " Apr 16 16:27:22.003087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.002963 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-service-ca\") pod \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " Apr 16 16:27:22.003087 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:27:22.002995 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmjgn\" (UniqueName: \"kubernetes.io/projected/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-kube-api-access-qmjgn\") pod \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " Apr 16 16:27:22.003087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.003018 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-oauth-config\") pod \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\" (UID: \"5b3891e5-7744-4d4b-a2bf-d43b4f68982b\") " Apr 16 16:27:22.003470 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.003435 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5b3891e5-7744-4d4b-a2bf-d43b4f68982b" (UID: "5b3891e5-7744-4d4b-a2bf-d43b4f68982b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:27:22.003563 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.003441 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5b3891e5-7744-4d4b-a2bf-d43b4f68982b" (UID: "5b3891e5-7744-4d4b-a2bf-d43b4f68982b"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:27:22.003616 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.003562 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-config" (OuterVolumeSpecName: "console-config") pod "5b3891e5-7744-4d4b-a2bf-d43b4f68982b" (UID: "5b3891e5-7744-4d4b-a2bf-d43b4f68982b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:27:22.003714 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.003460 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-service-ca" (OuterVolumeSpecName: "service-ca") pod "5b3891e5-7744-4d4b-a2bf-d43b4f68982b" (UID: "5b3891e5-7744-4d4b-a2bf-d43b4f68982b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:27:22.005194 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.005171 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5b3891e5-7744-4d4b-a2bf-d43b4f68982b" (UID: "5b3891e5-7744-4d4b-a2bf-d43b4f68982b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:27:22.005303 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.005195 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5b3891e5-7744-4d4b-a2bf-d43b4f68982b" (UID: "5b3891e5-7744-4d4b-a2bf-d43b4f68982b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:27:22.005303 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.005251 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-kube-api-access-qmjgn" (OuterVolumeSpecName: "kube-api-access-qmjgn") pod "5b3891e5-7744-4d4b-a2bf-d43b4f68982b" (UID: "5b3891e5-7744-4d4b-a2bf-d43b4f68982b"). InnerVolumeSpecName "kube-api-access-qmjgn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:27:22.104297 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.104262 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-oauth-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.104297 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.104296 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.104507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.104309 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-console-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.104507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.104322 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-oauth-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.104507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.104333 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-trusted-ca-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.104507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.104345 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-service-ca\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.104507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.104357 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmjgn\" (UniqueName: \"kubernetes.io/projected/5b3891e5-7744-4d4b-a2bf-d43b4f68982b-kube-api-access-qmjgn\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.655215 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.655183 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-959cd8fc-t4vzx_5b3891e5-7744-4d4b-a2bf-d43b4f68982b/console/0.log" Apr 16 16:27:22.655749 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.655233 2571 generic.go:358] "Generic (PLEG): container finished" podID="5b3891e5-7744-4d4b-a2bf-d43b4f68982b" containerID="e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588" exitCode=2 Apr 16 16:27:22.655749 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.655273 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-959cd8fc-t4vzx" event={"ID":"5b3891e5-7744-4d4b-a2bf-d43b4f68982b","Type":"ContainerDied","Data":"e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588"} Apr 16 16:27:22.655749 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.655321 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-959cd8fc-t4vzx" event={"ID":"5b3891e5-7744-4d4b-a2bf-d43b4f68982b","Type":"ContainerDied","Data":"17f497901d9e83f5c7bc8d3aa11c361922b81f557c044fcea395e5efd4a10541"} Apr 16 16:27:22.655749 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:27:22.655321 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-959cd8fc-t4vzx" Apr 16 16:27:22.655749 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.655337 2571 scope.go:117] "RemoveContainer" containerID="e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588" Apr 16 16:27:22.656764 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.656479 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f7w8r" event={"ID":"c466876b-30ba-4711-affc-092c2f5418b3","Type":"ContainerStarted","Data":"faa025fd2b585a2c8227146a05181f178542f9686a5605a67a543a1d21bd6dc8"} Apr 16 16:27:22.668798 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.668777 2571 scope.go:117] "RemoveContainer" containerID="e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588" Apr 16 16:27:22.669203 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:27:22.669171 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588\": container with ID starting with e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588 not found: ID does not exist" containerID="e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588" Apr 16 16:27:22.669306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.669211 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588"} err="failed to get container status \"e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588\": rpc error: code = NotFound desc = could not find container \"e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588\": container with ID starting with e29e406c706e1b87e6ab95c0c0f3cb067131725a2e5359ed7d02dccda2f04588 not found: ID does not 
exist" Apr 16 16:27:22.686332 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.686301 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-959cd8fc-t4vzx"] Apr 16 16:27:22.689463 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.689437 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-959cd8fc-t4vzx"] Apr 16 16:27:22.755631 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:22.755548 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3891e5-7744-4d4b-a2bf-d43b4f68982b" path="/var/lib/kubelet/pods/5b3891e5-7744-4d4b-a2bf-d43b4f68982b/volumes" Apr 16 16:27:26.671908 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:26.671873 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-f7w8r" event={"ID":"c466876b-30ba-4711-affc-092c2f5418b3","Type":"ContainerStarted","Data":"1e4b9ba7ad2f017dc02d9c144a077873ad3e702d991ed2347d080a7eda0332dd"} Apr 16 16:27:26.687772 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:27:26.686862 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-f7w8r" podStartSLOduration=1.31391412 podStartE2EDuration="5.686844952s" podCreationTimestamp="2026-04-16 16:27:21 +0000 UTC" firstStartedPulling="2026-04-16 16:27:21.770296254 +0000 UTC m=+245.605149136" lastFinishedPulling="2026-04-16 16:27:26.143227086 +0000 UTC m=+249.978079968" observedRunningTime="2026-04-16 16:27:26.685656669 +0000 UTC m=+250.520509573" watchObservedRunningTime="2026-04-16 16:27:26.686844952 +0000 UTC m=+250.521697862" Apr 16 16:28:16.652286 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:28:16.652253 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:28:16.652835 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:28:16.652427 2571 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:28:16.656192 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:28:16.656173 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:28:16.656325 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:28:16.656180 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:28:16.661191 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:28:16.661173 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:29:42.313680 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.313594 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv"] Apr 16 16:29:42.314177 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.313998 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b3891e5-7744-4d4b-a2bf-d43b4f68982b" containerName="console" Apr 16 16:29:42.314177 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.314012 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3891e5-7744-4d4b-a2bf-d43b4f68982b" containerName="console" Apr 16 16:29:42.314177 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.314091 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b3891e5-7744-4d4b-a2bf-d43b4f68982b" containerName="console" Apr 16 16:29:42.317108 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.317084 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.319799 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.319770 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:29:42.320777 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.320757 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:29:42.320922 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.320761 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-f4z7t\"" Apr 16 16:29:42.336255 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.336224 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv"] Apr 16 16:29:42.396026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.395987 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.396026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.396024 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.396290 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.396177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsq2q\" (UniqueName: \"kubernetes.io/projected/0adb4302-a287-4b4d-902d-15d82c49a845-kube-api-access-rsq2q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.497274 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.497230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsq2q\" (UniqueName: \"kubernetes.io/projected/0adb4302-a287-4b4d-902d-15d82c49a845-kube-api-access-rsq2q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.497456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.497294 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.497456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.497318 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.497744 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:29:42.497723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.497744 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.497738 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.506234 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.506211 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsq2q\" (UniqueName: \"kubernetes.io/projected/0adb4302-a287-4b4d-902d-15d82c49a845-kube-api-access-rsq2q\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.626933 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.626822 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:29:42.750988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.750964 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv"] Apr 16 16:29:42.758217 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:42.758194 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:29:43.071836 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:43.071797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" event={"ID":"0adb4302-a287-4b4d-902d-15d82c49a845","Type":"ContainerStarted","Data":"04fd67e132cbefa709cbc6f9c66db14b0c17d9dc526d0a97f2c3d0ac1e72b34e"} Apr 16 16:29:48.089488 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:48.089448 2571 generic.go:358] "Generic (PLEG): container finished" podID="0adb4302-a287-4b4d-902d-15d82c49a845" containerID="e6c40eebfefe678fdc959a41c94cfaf10f250beee8ab811c62190300d92209ff" exitCode=0 Apr 16 16:29:48.089897 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:48.089536 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" event={"ID":"0adb4302-a287-4b4d-902d-15d82c49a845","Type":"ContainerDied","Data":"e6c40eebfefe678fdc959a41c94cfaf10f250beee8ab811c62190300d92209ff"} Apr 16 16:29:51.101667 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:51.101578 2571 generic.go:358] "Generic (PLEG): container finished" podID="0adb4302-a287-4b4d-902d-15d82c49a845" containerID="eae1864932bf3da67b4da7f000011b4936e31537692d6268502fed3592475449" exitCode=0 Apr 16 16:29:51.102021 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:29:51.101663 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" event={"ID":"0adb4302-a287-4b4d-902d-15d82c49a845","Type":"ContainerDied","Data":"eae1864932bf3da67b4da7f000011b4936e31537692d6268502fed3592475449"} Apr 16 16:30:00.130000 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:00.129962 2571 generic.go:358] "Generic (PLEG): container finished" podID="0adb4302-a287-4b4d-902d-15d82c49a845" containerID="373030fabdd8e0ed1154c0b53a6825a110ca979788583087ac63543b400f8a2b" exitCode=0 Apr 16 16:30:00.130390 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:00.130009 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" event={"ID":"0adb4302-a287-4b4d-902d-15d82c49a845","Type":"ContainerDied","Data":"373030fabdd8e0ed1154c0b53a6825a110ca979788583087ac63543b400f8a2b"} Apr 16 16:30:01.259091 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.259067 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:30:01.363483 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.363448 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsq2q\" (UniqueName: \"kubernetes.io/projected/0adb4302-a287-4b4d-902d-15d82c49a845-kube-api-access-rsq2q\") pod \"0adb4302-a287-4b4d-902d-15d82c49a845\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " Apr 16 16:30:01.363654 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.363516 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-bundle\") pod \"0adb4302-a287-4b4d-902d-15d82c49a845\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " Apr 16 16:30:01.363654 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.363585 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-util\") pod \"0adb4302-a287-4b4d-902d-15d82c49a845\" (UID: \"0adb4302-a287-4b4d-902d-15d82c49a845\") " Apr 16 16:30:01.364228 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.364200 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-bundle" (OuterVolumeSpecName: "bundle") pod "0adb4302-a287-4b4d-902d-15d82c49a845" (UID: "0adb4302-a287-4b4d-902d-15d82c49a845"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:30:01.365720 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.365695 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0adb4302-a287-4b4d-902d-15d82c49a845-kube-api-access-rsq2q" (OuterVolumeSpecName: "kube-api-access-rsq2q") pod "0adb4302-a287-4b4d-902d-15d82c49a845" (UID: "0adb4302-a287-4b4d-902d-15d82c49a845"). InnerVolumeSpecName "kube-api-access-rsq2q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:30:01.369085 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.369049 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-util" (OuterVolumeSpecName: "util") pod "0adb4302-a287-4b4d-902d-15d82c49a845" (UID: "0adb4302-a287-4b4d-902d-15d82c49a845"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:30:01.464988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.464939 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:30:01.464988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.464983 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rsq2q\" (UniqueName: \"kubernetes.io/projected/0adb4302-a287-4b4d-902d-15d82c49a845-kube-api-access-rsq2q\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:30:01.464988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:01.464996 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0adb4302-a287-4b4d-902d-15d82c49a845-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:30:02.138098 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:02.138062 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" event={"ID":"0adb4302-a287-4b4d-902d-15d82c49a845","Type":"ContainerDied","Data":"04fd67e132cbefa709cbc6f9c66db14b0c17d9dc526d0a97f2c3d0ac1e72b34e"} Apr 16 16:30:02.138098 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:02.138086 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csrgpv" Apr 16 16:30:02.138098 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:02.138096 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04fd67e132cbefa709cbc6f9c66db14b0c17d9dc526d0a97f2c3d0ac1e72b34e" Apr 16 16:30:04.337007 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.336974 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl"] Apr 16 16:30:04.337431 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.337413 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0adb4302-a287-4b4d-902d-15d82c49a845" containerName="extract" Apr 16 16:30:04.337431 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.337427 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adb4302-a287-4b4d-902d-15d82c49a845" containerName="extract" Apr 16 16:30:04.337551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.337437 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0adb4302-a287-4b4d-902d-15d82c49a845" containerName="pull" Apr 16 16:30:04.337551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.337443 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adb4302-a287-4b4d-902d-15d82c49a845" containerName="pull" Apr 16 16:30:04.337551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.337452 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0adb4302-a287-4b4d-902d-15d82c49a845" containerName="util" Apr 16 16:30:04.337551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.337457 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adb4302-a287-4b4d-902d-15d82c49a845" containerName="util" Apr 16 16:30:04.337551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.337504 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0adb4302-a287-4b4d-902d-15d82c49a845" containerName="extract" Apr 16 16:30:04.377012 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.376982 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl"] Apr 16 16:30:04.377195 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.377103 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.379843 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.379816 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 16:30:04.379969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.379868 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 16:30:04.380194 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.380174 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 16:30:04.380282 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.380265 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-8tpfk\"" Apr 16 16:30:04.488346 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.488303 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvkf2\" (UniqueName: 
\"kubernetes.io/projected/b9b64de3-181e-4641-8f1f-8c042b0c7870-kube-api-access-zvkf2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl\" (UID: \"b9b64de3-181e-4641-8f1f-8c042b0c7870\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.488518 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.488374 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b9b64de3-181e-4641-8f1f-8c042b0c7870-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl\" (UID: \"b9b64de3-181e-4641-8f1f-8c042b0c7870\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.589539 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.589464 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvkf2\" (UniqueName: \"kubernetes.io/projected/b9b64de3-181e-4641-8f1f-8c042b0c7870-kube-api-access-zvkf2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl\" (UID: \"b9b64de3-181e-4641-8f1f-8c042b0c7870\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.589539 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.589530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b9b64de3-181e-4641-8f1f-8c042b0c7870-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl\" (UID: \"b9b64de3-181e-4641-8f1f-8c042b0c7870\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.591800 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.591774 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/b9b64de3-181e-4641-8f1f-8c042b0c7870-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl\" (UID: 
\"b9b64de3-181e-4641-8f1f-8c042b0c7870\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.600026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.599992 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvkf2\" (UniqueName: \"kubernetes.io/projected/b9b64de3-181e-4641-8f1f-8c042b0c7870-kube-api-access-zvkf2\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl\" (UID: \"b9b64de3-181e-4641-8f1f-8c042b0c7870\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.687628 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.687590 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" Apr 16 16:30:04.831248 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:04.831210 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl"] Apr 16 16:30:04.835206 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:30:04.835174 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b64de3_181e_4641_8f1f_8c042b0c7870.slice/crio-b8ca288d56677014c0f5375c3da46c100c135e7053f3936c6fa15a82e8b54bf2 WatchSource:0}: Error finding container b8ca288d56677014c0f5375c3da46c100c135e7053f3936c6fa15a82e8b54bf2: Status 404 returned error can't find the container with id b8ca288d56677014c0f5375c3da46c100c135e7053f3936c6fa15a82e8b54bf2 Apr 16 16:30:05.147891 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:05.147856 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" event={"ID":"b9b64de3-181e-4641-8f1f-8c042b0c7870","Type":"ContainerStarted","Data":"b8ca288d56677014c0f5375c3da46c100c135e7053f3936c6fa15a82e8b54bf2"} Apr 16 16:30:11.171642 ip-10-0-138-125 kubenswrapper[2571]: 
I0416 16:30:11.171605 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" event={"ID":"b9b64de3-181e-4641-8f1f-8c042b0c7870","Type":"ContainerStarted","Data":"eed874e056e63e0d803e92d9057460873b967c4a97d6dfe280d5ead2e08b963e"}
Apr 16 16:30:11.172163 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.171731 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl"
Apr 16 16:30:11.192866 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.192817 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl" podStartSLOduration=0.99270248 podStartE2EDuration="7.192801278s" podCreationTimestamp="2026-04-16 16:30:04 +0000 UTC" firstStartedPulling="2026-04-16 16:30:04.83687738 +0000 UTC m=+408.671730262" lastFinishedPulling="2026-04-16 16:30:11.036976165 +0000 UTC m=+414.871829060" observedRunningTime="2026-04-16 16:30:11.191170406 +0000 UTC m=+415.026023307" watchObservedRunningTime="2026-04-16 16:30:11.192801278 +0000 UTC m=+415.027654181"
Apr 16 16:30:11.974897 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.974863 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"]
Apr 16 16:30:11.978209 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.978190 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:11.980531 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.980506 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 16:30:11.980759 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.980741 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5xrck\""
Apr 16 16:30:11.980963 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.980950 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 16:30:11.988993 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:11.988957 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"]
Apr 16 16:30:12.055390 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.055351 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46s5n\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-kube-api-access-46s5n\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.055571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.055421 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.055571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.055453 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/739bf4b6-710a-4951-b918-2fc347aabf9a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.156420 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.156386 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-46s5n\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-kube-api-access-46s5n\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.156584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.156441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.156584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.156468 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/739bf4b6-710a-4951-b918-2fc347aabf9a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.156653 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.156592 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:30:12.156653 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.156609 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:30:12.156653 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.156632 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4w487: references non-existent secret key: tls.crt
Apr 16 16:30:12.156761 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.156705 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates podName:739bf4b6-710a-4951-b918-2fc347aabf9a nodeName:}" failed. No retries permitted until 2026-04-16 16:30:12.656681982 +0000 UTC m=+416.491534883 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates") pod "keda-metrics-apiserver-7c9f485588-4w487" (UID: "739bf4b6-710a-4951-b918-2fc347aabf9a") : references non-existent secret key: tls.crt
Apr 16 16:30:12.156961 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.156942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/739bf4b6-710a-4951-b918-2fc347aabf9a-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.167597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.167568 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-46s5n\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-kube-api-access-46s5n\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.660888 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:12.660853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:12.661326 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.660974 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:30:12.661326 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.660986 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:30:12.661326 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.661004 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4w487: references non-existent secret key: tls.crt
Apr 16 16:30:12.661326 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:12.661076 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates podName:739bf4b6-710a-4951-b918-2fc347aabf9a nodeName:}" failed. No retries permitted until 2026-04-16 16:30:13.661048886 +0000 UTC m=+417.495901784 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates") pod "keda-metrics-apiserver-7c9f485588-4w487" (UID: "739bf4b6-710a-4951-b918-2fc347aabf9a") : references non-existent secret key: tls.crt
Apr 16 16:30:13.669904 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:13.669857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:13.670337 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:13.670008 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:30:13.670337 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:13.670029 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:30:13.670337 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:13.670048 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4w487: references non-existent secret key: tls.crt
Apr 16 16:30:13.670337 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:13.670101 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates podName:739bf4b6-710a-4951-b918-2fc347aabf9a nodeName:}" failed. No retries permitted until 2026-04-16 16:30:15.67008535 +0000 UTC m=+419.504938232 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates") pod "keda-metrics-apiserver-7c9f485588-4w487" (UID: "739bf4b6-710a-4951-b918-2fc347aabf9a") : references non-existent secret key: tls.crt
Apr 16 16:30:15.687746 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:15.687698 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:15.688211 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:15.687841 2571 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:30:15.688211 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:15.687862 2571 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:30:15.688211 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:15.687882 2571 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-4w487: references non-existent secret key: tls.crt
Apr 16 16:30:15.688211 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:30:15.687934 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates podName:739bf4b6-710a-4951-b918-2fc347aabf9a nodeName:}" failed. No retries permitted until 2026-04-16 16:30:19.68791934 +0000 UTC m=+423.522772222 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates") pod "keda-metrics-apiserver-7c9f485588-4w487" (UID: "739bf4b6-710a-4951-b918-2fc347aabf9a") : references non-existent secret key: tls.crt
Apr 16 16:30:19.721106 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:19.721067 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:19.723809 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:19.723781 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/739bf4b6-710a-4951-b918-2fc347aabf9a-certificates\") pod \"keda-metrics-apiserver-7c9f485588-4w487\" (UID: \"739bf4b6-710a-4951-b918-2fc347aabf9a\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:19.790840 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:19.790809 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-5xrck\""
Apr 16 16:30:19.799666 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:19.799640 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:19.919964 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:19.919935 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"]
Apr 16 16:30:19.922545 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:30:19.922505 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod739bf4b6_710a_4951_b918_2fc347aabf9a.slice/crio-36f618284bd355e8994146f9ee0d456aeaca1db8fc48fcd5073339fcbfaee718 WatchSource:0}: Error finding container 36f618284bd355e8994146f9ee0d456aeaca1db8fc48fcd5073339fcbfaee718: Status 404 returned error can't find the container with id 36f618284bd355e8994146f9ee0d456aeaca1db8fc48fcd5073339fcbfaee718
Apr 16 16:30:20.202869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:20.202830 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487" event={"ID":"739bf4b6-710a-4951-b918-2fc347aabf9a","Type":"ContainerStarted","Data":"36f618284bd355e8994146f9ee0d456aeaca1db8fc48fcd5073339fcbfaee718"}
Apr 16 16:30:24.218430 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:24.218394 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487" event={"ID":"739bf4b6-710a-4951-b918-2fc347aabf9a","Type":"ContainerStarted","Data":"9404065b7db4b79b1e5d18c03e31374ce53dd413c8e5e8f5606a5d4ed6d14115"}
Apr 16 16:30:24.218809 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:24.218454 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:24.235516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:24.235469 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487" podStartSLOduration=9.853379267 podStartE2EDuration="13.2354549s" podCreationTimestamp="2026-04-16 16:30:11 +0000 UTC" firstStartedPulling="2026-04-16 16:30:19.923865254 +0000 UTC m=+423.758718135" lastFinishedPulling="2026-04-16 16:30:23.305940887 +0000 UTC m=+427.140793768" observedRunningTime="2026-04-16 16:30:24.234224155 +0000 UTC m=+428.069077060" watchObservedRunningTime="2026-04-16 16:30:24.2354549 +0000 UTC m=+428.070307804"
Apr 16 16:30:32.177983 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:32.177944 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-6z4gl"
Apr 16 16:30:35.227234 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:35.227203 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-4w487"
Apr 16 16:30:40.362252 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.362210 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c6d9946d7-vfltv"]
Apr 16 16:30:40.365651 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.365624 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.378450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.378419 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6d9946d7-vfltv"]
Apr 16 16:30:40.390804 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.390763 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-service-ca\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.391075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.390813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-trusted-ca-bundle\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.391075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.390874 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-oauth-serving-cert\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.391075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.390910 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-console-config\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.391075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.390944 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df7489c2-d9fe-42c5-b256-842a98898a25-console-oauth-config\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.391075 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.390979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhljb\" (UniqueName: \"kubernetes.io/projected/df7489c2-d9fe-42c5-b256-842a98898a25-kube-api-access-nhljb\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.391501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.391103 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df7489c2-d9fe-42c5-b256-842a98898a25-console-serving-cert\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.491877 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.491840 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df7489c2-d9fe-42c5-b256-842a98898a25-console-oauth-config\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.491887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhljb\" (UniqueName: \"kubernetes.io/projected/df7489c2-d9fe-42c5-b256-842a98898a25-kube-api-access-nhljb\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.491931 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df7489c2-d9fe-42c5-b256-842a98898a25-console-serving-cert\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.491951 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-service-ca\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.491966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-trusted-ca-bundle\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.492007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-oauth-serving-cert\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.492047 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-console-config\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492817 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.492773 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-console-config\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492817 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.492801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-service-ca\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492991 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.492858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-oauth-serving-cert\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.492991 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.492963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df7489c2-d9fe-42c5-b256-842a98898a25-trusted-ca-bundle\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.494529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.494501 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/df7489c2-d9fe-42c5-b256-842a98898a25-console-serving-cert\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.494687 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.494666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/df7489c2-d9fe-42c5-b256-842a98898a25-console-oauth-config\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.502919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.502894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhljb\" (UniqueName: \"kubernetes.io/projected/df7489c2-d9fe-42c5-b256-842a98898a25-kube-api-access-nhljb\") pod \"console-5c6d9946d7-vfltv\" (UID: \"df7489c2-d9fe-42c5-b256-842a98898a25\") " pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.675798 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.675758 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:40.810143 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:40.810104 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6d9946d7-vfltv"]
Apr 16 16:30:40.812364 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:30:40.812328 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf7489c2_d9fe_42c5_b256_842a98898a25.slice/crio-b56bb09bc30823bb3db00e3b24f8a6702a8b4275b52249b725281318a25f866d WatchSource:0}: Error finding container b56bb09bc30823bb3db00e3b24f8a6702a8b4275b52249b725281318a25f866d: Status 404 returned error can't find the container with id b56bb09bc30823bb3db00e3b24f8a6702a8b4275b52249b725281318a25f866d
Apr 16 16:30:41.278166 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:41.278129 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6d9946d7-vfltv" event={"ID":"df7489c2-d9fe-42c5-b256-842a98898a25","Type":"ContainerStarted","Data":"3f067301e6fed0984fcb9f098b1f90ba267db9796edd1b019ab614008b533700"}
Apr 16 16:30:41.278166 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:41.278165 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6d9946d7-vfltv" event={"ID":"df7489c2-d9fe-42c5-b256-842a98898a25","Type":"ContainerStarted","Data":"b56bb09bc30823bb3db00e3b24f8a6702a8b4275b52249b725281318a25f866d"}
Apr 16 16:30:41.296360 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:41.296309 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c6d9946d7-vfltv" podStartSLOduration=1.296293238 podStartE2EDuration="1.296293238s" podCreationTimestamp="2026-04-16 16:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:30:41.294276256 +0000 UTC m=+445.129129160" watchObservedRunningTime="2026-04-16 16:30:41.296293238 +0000 UTC m=+445.131146143"
Apr 16 16:30:50.676463 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:50.676411 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:50.677039 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:50.676568 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:50.681623 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:50.681601 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:51.315837 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:51.315810 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c6d9946d7-vfltv"
Apr 16 16:30:51.362585 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:30:51.362548 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cd89987cd-p4kp6"]
Apr 16 16:31:02.470133 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.470079 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"]
Apr 16 16:31:02.480538 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.480512 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.482577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.482545 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"]
Apr 16 16:31:02.483265 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.483222 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:31:02.483389 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.483240 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:31:02.483882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.483863 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-f4z7t\""
Apr 16 16:31:02.583408 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.583370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.583589 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.583424 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkvmg\" (UniqueName: \"kubernetes.io/projected/955aea2e-4619-4b22-8de9-5cf3f59d9e34-kube-api-access-hkvmg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.583589 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.583525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.684862 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.684807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.685014 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.684973 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkvmg\" (UniqueName: \"kubernetes.io/projected/955aea2e-4619-4b22-8de9-5cf3f59d9e34-kube-api-access-hkvmg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.685053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.685012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.685301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.685283 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.685378 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.685364 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.692786 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.692764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkvmg\" (UniqueName: \"kubernetes.io/projected/955aea2e-4619-4b22-8de9-5cf3f59d9e34-kube-api-access-hkvmg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.791388 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.791306 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:02.916569 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:02.916483 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"]
Apr 16 16:31:02.919223 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:02.919198 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955aea2e_4619_4b22_8de9_5cf3f59d9e34.slice/crio-41e37855e08ae24364723dccd08cb5cac4588a01cd4dc7e8fa7d11fa65eacd7c WatchSource:0}: Error finding container 41e37855e08ae24364723dccd08cb5cac4588a01cd4dc7e8fa7d11fa65eacd7c: Status 404 returned error can't find the container with id 41e37855e08ae24364723dccd08cb5cac4588a01cd4dc7e8fa7d11fa65eacd7c
Apr 16 16:31:03.355474 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:03.355439 2571 generic.go:358] "Generic (PLEG): container finished" podID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerID="917246ae60daf06e9d59d869f4aff5b7de7002b0663ffaa324ce88f38dcd04c0" exitCode=0
Apr 16 16:31:03.355645 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:03.355486 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7" event={"ID":"955aea2e-4619-4b22-8de9-5cf3f59d9e34","Type":"ContainerDied","Data":"917246ae60daf06e9d59d869f4aff5b7de7002b0663ffaa324ce88f38dcd04c0"}
Apr 16 16:31:03.355645 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:03.355512 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7" event={"ID":"955aea2e-4619-4b22-8de9-5cf3f59d9e34","Type":"ContainerStarted","Data":"41e37855e08ae24364723dccd08cb5cac4588a01cd4dc7e8fa7d11fa65eacd7c"}
Apr 16 16:31:04.360929 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:04.360895 2571 generic.go:358] "Generic (PLEG): container finished" podID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerID="8fce67f551105858f1cfd8dcd9690b1785102e93bcbb87b5ef50d2d924f79347" exitCode=0
Apr 16 16:31:04.361348 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:04.360943 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7" event={"ID":"955aea2e-4619-4b22-8de9-5cf3f59d9e34","Type":"ContainerDied","Data":"8fce67f551105858f1cfd8dcd9690b1785102e93bcbb87b5ef50d2d924f79347"}
Apr 16 16:31:05.365795 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:05.365761 2571 generic.go:358] "Generic (PLEG): container finished" podID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerID="6d7027b21b845941cb85a6119e3dd342da112a5e090a9378e5ad22d15be824e3" exitCode=0
Apr 16 16:31:05.366187 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:05.365834 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7" event={"ID":"955aea2e-4619-4b22-8de9-5cf3f59d9e34","Type":"ContainerDied","Data":"6d7027b21b845941cb85a6119e3dd342da112a5e090a9378e5ad22d15be824e3"}
Apr 16 16:31:06.492088 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.492063 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7"
Apr 16 16:31:06.617972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.617932 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-bundle\") pod \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") "
Apr 16 16:31:06.617972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.617979 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-util\") pod \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") "
Apr 16 16:31:06.618209 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.618036 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkvmg\" (UniqueName: \"kubernetes.io/projected/955aea2e-4619-4b22-8de9-5cf3f59d9e34-kube-api-access-hkvmg\") pod \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\" (UID: \"955aea2e-4619-4b22-8de9-5cf3f59d9e34\") "
Apr 16 16:31:06.618605 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.618578 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-bundle" (OuterVolumeSpecName: "bundle") pod "955aea2e-4619-4b22-8de9-5cf3f59d9e34" (UID: "955aea2e-4619-4b22-8de9-5cf3f59d9e34"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:06.620184 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.620154 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955aea2e-4619-4b22-8de9-5cf3f59d9e34-kube-api-access-hkvmg" (OuterVolumeSpecName: "kube-api-access-hkvmg") pod "955aea2e-4619-4b22-8de9-5cf3f59d9e34" (UID: "955aea2e-4619-4b22-8de9-5cf3f59d9e34"). InnerVolumeSpecName "kube-api-access-hkvmg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:06.623724 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.623658 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-util" (OuterVolumeSpecName: "util") pod "955aea2e-4619-4b22-8de9-5cf3f59d9e34" (UID: "955aea2e-4619-4b22-8de9-5cf3f59d9e34"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:06.718994 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.718957 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:06.718994 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.718983 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955aea2e-4619-4b22-8de9-5cf3f59d9e34-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:06.718994 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:06.718993 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkvmg\" (UniqueName: \"kubernetes.io/projected/955aea2e-4619-4b22-8de9-5cf3f59d9e34-kube-api-access-hkvmg\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:07.375487 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:07.375444 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7" event={"ID":"955aea2e-4619-4b22-8de9-5cf3f59d9e34","Type":"ContainerDied","Data":"41e37855e08ae24364723dccd08cb5cac4588a01cd4dc7e8fa7d11fa65eacd7c"} Apr 16 16:31:07.375487 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:07.375489 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41e37855e08ae24364723dccd08cb5cac4588a01cd4dc7e8fa7d11fa65eacd7c" Apr 16 16:31:07.375487 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:07.375462 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5hs6p7" Apr 16 16:31:14.801909 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.801871 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh"] Apr 16 16:31:14.802306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.802279 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerName="pull" Apr 16 16:31:14.802306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.802293 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerName="pull" Apr 16 16:31:14.802306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.802307 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerName="util" Apr 16 16:31:14.802403 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.802314 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerName="util" Apr 16 16:31:14.802403 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.802327 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerName="extract" Apr 16 16:31:14.802403 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.802334 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerName="extract" Apr 16 16:31:14.802403 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.802381 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="955aea2e-4619-4b22-8de9-5cf3f59d9e34" containerName="extract" Apr 16 16:31:14.806621 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.806605 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:14.809266 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.809238 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:31:14.809404 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.809377 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 16:31:14.809521 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.809503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-k8gcj\"" Apr 16 16:31:14.818645 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.818621 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh"] Apr 16 16:31:14.889217 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.889179 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7d9d268-82d3-44f7-b7cd-b38b37ffc057-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nznwh\" (UID: 
\"f7d9d268-82d3-44f7-b7cd-b38b37ffc057\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:14.889394 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.889233 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5cb9\" (UniqueName: \"kubernetes.io/projected/f7d9d268-82d3-44f7-b7cd-b38b37ffc057-kube-api-access-k5cb9\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nznwh\" (UID: \"f7d9d268-82d3-44f7-b7cd-b38b37ffc057\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:14.990154 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.990097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7d9d268-82d3-44f7-b7cd-b38b37ffc057-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nznwh\" (UID: \"f7d9d268-82d3-44f7-b7cd-b38b37ffc057\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:14.990319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.990236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5cb9\" (UniqueName: \"kubernetes.io/projected/f7d9d268-82d3-44f7-b7cd-b38b37ffc057-kube-api-access-k5cb9\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nznwh\" (UID: \"f7d9d268-82d3-44f7-b7cd-b38b37ffc057\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:14.990517 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.990496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f7d9d268-82d3-44f7-b7cd-b38b37ffc057-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nznwh\" (UID: \"f7d9d268-82d3-44f7-b7cd-b38b37ffc057\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:14.998554 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:14.998523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5cb9\" (UniqueName: \"kubernetes.io/projected/f7d9d268-82d3-44f7-b7cd-b38b37ffc057-kube-api-access-k5cb9\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nznwh\" (UID: \"f7d9d268-82d3-44f7-b7cd-b38b37ffc057\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:15.115175 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:15.115071 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" Apr 16 16:31:15.243142 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:15.243085 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh"] Apr 16 16:31:15.246985 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:15.246952 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7d9d268_82d3_44f7_b7cd_b38b37ffc057.slice/crio-b93243feb8eeafef4fc068befa1b32e5e0b64e4dcaf2ebc1c18c39b7d36a5d91 WatchSource:0}: Error finding container b93243feb8eeafef4fc068befa1b32e5e0b64e4dcaf2ebc1c18c39b7d36a5d91: Status 404 returned error can't find the container with id b93243feb8eeafef4fc068befa1b32e5e0b64e4dcaf2ebc1c18c39b7d36a5d91 Apr 16 16:31:15.403155 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:15.403056 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" event={"ID":"f7d9d268-82d3-44f7-b7cd-b38b37ffc057","Type":"ContainerStarted","Data":"b93243feb8eeafef4fc068befa1b32e5e0b64e4dcaf2ebc1c18c39b7d36a5d91"} Apr 16 16:31:16.384822 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.384774 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cd89987cd-p4kp6" podUID="02016f6b-3a13-4224-98a2-f77e0c52261f" containerName="console" containerID="cri-o://77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e" gracePeriod=15 Apr 16 16:31:16.620462 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.620428 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74"] Apr 16 16:31:16.624801 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.624777 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.627761 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.627734 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:31:16.627880 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.627797 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-f4z7t\"" Apr 16 16:31:16.628678 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.628649 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:31:16.641730 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.641665 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74"] Apr 16 16:31:16.707330 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.707290 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-util\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.707330 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.707333 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.707550 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.707355 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc79g\" (UniqueName: \"kubernetes.io/projected/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-kube-api-access-tc79g\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.807959 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.807922 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.808163 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.807975 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-bundle\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.808163 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.808007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc79g\" (UniqueName: \"kubernetes.io/projected/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-kube-api-access-tc79g\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.808381 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.808352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.808507 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.808386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.816542 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.816510 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc79g\" (UniqueName: \"kubernetes.io/projected/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-kube-api-access-tc79g\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74\" 
(UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:16.935042 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:16.935005 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:17.116009 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.115986 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cd89987cd-p4kp6_02016f6b-3a13-4224-98a2-f77e0c52261f/console/0.log" Apr 16 16:31:17.116165 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.116050 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:31:17.211295 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211226 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-service-ca\") pod \"02016f6b-3a13-4224-98a2-f77e0c52261f\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " Apr 16 16:31:17.211295 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211269 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-oauth-serving-cert\") pod \"02016f6b-3a13-4224-98a2-f77e0c52261f\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " Apr 16 16:31:17.211544 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211318 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-oauth-config\") pod \"02016f6b-3a13-4224-98a2-f77e0c52261f\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " Apr 16 16:31:17.211544 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211350 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-console-config\") pod \"02016f6b-3a13-4224-98a2-f77e0c52261f\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " Apr 16 16:31:17.211544 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211470 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9xq7\" (UniqueName: \"kubernetes.io/projected/02016f6b-3a13-4224-98a2-f77e0c52261f-kube-api-access-h9xq7\") pod \"02016f6b-3a13-4224-98a2-f77e0c52261f\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " Apr 16 16:31:17.211544 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211504 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-trusted-ca-bundle\") pod \"02016f6b-3a13-4224-98a2-f77e0c52261f\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " Apr 16 16:31:17.211544 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211537 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-serving-cert\") pod \"02016f6b-3a13-4224-98a2-f77e0c52261f\" (UID: \"02016f6b-3a13-4224-98a2-f77e0c52261f\") " Apr 16 16:31:17.211794 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.211728 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-service-ca" (OuterVolumeSpecName: "service-ca") pod "02016f6b-3a13-4224-98a2-f77e0c52261f" (UID: "02016f6b-3a13-4224-98a2-f77e0c52261f"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:31:17.212244 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.212096 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "02016f6b-3a13-4224-98a2-f77e0c52261f" (UID: "02016f6b-3a13-4224-98a2-f77e0c52261f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:31:17.212244 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.212202 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-console-config" (OuterVolumeSpecName: "console-config") pod "02016f6b-3a13-4224-98a2-f77e0c52261f" (UID: "02016f6b-3a13-4224-98a2-f77e0c52261f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:31:17.212407 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.212321 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "02016f6b-3a13-4224-98a2-f77e0c52261f" (UID: "02016f6b-3a13-4224-98a2-f77e0c52261f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:31:17.214550 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.214516 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02016f6b-3a13-4224-98a2-f77e0c52261f-kube-api-access-h9xq7" (OuterVolumeSpecName: "kube-api-access-h9xq7") pod "02016f6b-3a13-4224-98a2-f77e0c52261f" (UID: "02016f6b-3a13-4224-98a2-f77e0c52261f"). InnerVolumeSpecName "kube-api-access-h9xq7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:17.214550 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.214540 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "02016f6b-3a13-4224-98a2-f77e0c52261f" (UID: "02016f6b-3a13-4224-98a2-f77e0c52261f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:17.214693 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.214668 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "02016f6b-3a13-4224-98a2-f77e0c52261f" (UID: "02016f6b-3a13-4224-98a2-f77e0c52261f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:17.228558 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.228534 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74"] Apr 16 16:31:17.230276 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:17.230253 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0043ea3_8fe5_4b61_87f7_2b8a697f6e7e.slice/crio-fa6d0278a95608a7392ffd0c40cd1c0a5a310afcf638efb7be9ffef991e1edf5 WatchSource:0}: Error finding container fa6d0278a95608a7392ffd0c40cd1c0a5a310afcf638efb7be9ffef991e1edf5: Status 404 returned error can't find the container with id fa6d0278a95608a7392ffd0c40cd1c0a5a310afcf638efb7be9ffef991e1edf5 Apr 16 16:31:17.312581 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.312546 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9xq7\" (UniqueName: 
\"kubernetes.io/projected/02016f6b-3a13-4224-98a2-f77e0c52261f-kube-api-access-h9xq7\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:17.312581 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.312580 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-trusted-ca-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:17.312803 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.312596 2571 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:17.312803 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.312611 2571 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-service-ca\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:17.312803 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.312626 2571 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-oauth-serving-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:17.312803 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.312639 2571 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/02016f6b-3a13-4224-98a2-f77e0c52261f-console-oauth-config\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:17.312803 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.312654 2571 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/02016f6b-3a13-4224-98a2-f77e0c52261f-console-config\") on node 
\"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:17.412499 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.412472 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cd89987cd-p4kp6_02016f6b-3a13-4224-98a2-f77e0c52261f/console/0.log" Apr 16 16:31:17.412946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.412514 2571 generic.go:358] "Generic (PLEG): container finished" podID="02016f6b-3a13-4224-98a2-f77e0c52261f" containerID="77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e" exitCode=2 Apr 16 16:31:17.412946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.412578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd89987cd-p4kp6" event={"ID":"02016f6b-3a13-4224-98a2-f77e0c52261f","Type":"ContainerDied","Data":"77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e"} Apr 16 16:31:17.412946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.412601 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cd89987cd-p4kp6" event={"ID":"02016f6b-3a13-4224-98a2-f77e0c52261f","Type":"ContainerDied","Data":"36643431bb9fca1c7a9cb7cf612a557555870894f72e7f7e62b50994cb744d76"} Apr 16 16:31:17.412946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.412602 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cd89987cd-p4kp6" Apr 16 16:31:17.412946 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.412620 2571 scope.go:117] "RemoveContainer" containerID="77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e" Apr 16 16:31:17.414442 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.414407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" event={"ID":"f7d9d268-82d3-44f7-b7cd-b38b37ffc057","Type":"ContainerStarted","Data":"7e3a6535cdfe7da2184d0f1840f76196122c0d806eb7d9aa252dae982b12e80c"} Apr 16 16:31:17.415817 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.415794 2571 generic.go:358] "Generic (PLEG): container finished" podID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerID="b089aa9c70add8e2ab1bba7def4a9c28bf4bfe6d5000ef1afcf2cb851c575c00" exitCode=0 Apr 16 16:31:17.415931 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.415871 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" event={"ID":"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e","Type":"ContainerDied","Data":"b089aa9c70add8e2ab1bba7def4a9c28bf4bfe6d5000ef1afcf2cb851c575c00"} Apr 16 16:31:17.415931 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.415899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" event={"ID":"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e","Type":"ContainerStarted","Data":"fa6d0278a95608a7392ffd0c40cd1c0a5a310afcf638efb7be9ffef991e1edf5"} Apr 16 16:31:17.421360 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.421338 2571 scope.go:117] "RemoveContainer" containerID="77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e" Apr 16 16:31:17.421621 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:31:17.421603 2571 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e\": container with ID starting with 77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e not found: ID does not exist" containerID="77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e" Apr 16 16:31:17.421680 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.421631 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e"} err="failed to get container status \"77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e\": rpc error: code = NotFound desc = could not find container \"77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e\": container with ID starting with 77b68bb0dd16d9ce9cb8a11a680323f96cc6eca3d7e6de917d03ae605b00694e not found: ID does not exist" Apr 16 16:31:17.487274 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.487169 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nznwh" podStartSLOduration=1.569817285 podStartE2EDuration="3.48715322s" podCreationTimestamp="2026-04-16 16:31:14 +0000 UTC" firstStartedPulling="2026-04-16 16:31:15.249590893 +0000 UTC m=+479.084443778" lastFinishedPulling="2026-04-16 16:31:17.166926808 +0000 UTC m=+481.001779713" observedRunningTime="2026-04-16 16:31:17.477002561 +0000 UTC m=+481.311855464" watchObservedRunningTime="2026-04-16 16:31:17.48715322 +0000 UTC m=+481.322006123" Apr 16 16:31:17.500963 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.500934 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cd89987cd-p4kp6"] Apr 16 16:31:17.518017 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.517990 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-cd89987cd-p4kp6"] Apr 16 16:31:17.576766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.576722 2571 patch_prober.go:28] interesting pod/console-cd89987cd-p4kp6 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.25:8443/health\": context deadline exceeded" start-of-body= Apr 16 16:31:17.576932 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:17.576784 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-cd89987cd-p4kp6" podUID="02016f6b-3a13-4224-98a2-f77e0c52261f" containerName="console" probeResult="failure" output="Get \"https://10.134.0.25:8443/health\": context deadline exceeded" Apr 16 16:31:18.756675 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:18.756635 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02016f6b-3a13-4224-98a2-f77e0c52261f" path="/var/lib/kubelet/pods/02016f6b-3a13-4224-98a2-f77e0c52261f/volumes" Apr 16 16:31:20.434201 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:20.434161 2571 generic.go:358] "Generic (PLEG): container finished" podID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerID="a34454218438b970f1c5bb523f53cef5710fa3ed77176d8af6515d405ae0756e" exitCode=0 Apr 16 16:31:20.434556 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:20.434229 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" event={"ID":"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e","Type":"ContainerDied","Data":"a34454218438b970f1c5bb523f53cef5710fa3ed77176d8af6515d405ae0756e"} Apr 16 16:31:21.440010 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:21.439977 2571 generic.go:358] "Generic (PLEG): container finished" podID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerID="5ad3c99110197893c80c965790d1da3f51ae3c48f0152332e752f0da2310c4d8" exitCode=0 Apr 16 16:31:21.440422 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:31:21.440063 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" event={"ID":"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e","Type":"ContainerDied","Data":"5ad3c99110197893c80c965790d1da3f51ae3c48f0152332e752f0da2310c4d8"} Apr 16 16:31:22.451288 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.451256 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-rzd9r"] Apr 16 16:31:22.451805 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.451786 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02016f6b-3a13-4224-98a2-f77e0c52261f" containerName="console" Apr 16 16:31:22.451883 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.451808 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="02016f6b-3a13-4224-98a2-f77e0c52261f" containerName="console" Apr 16 16:31:22.451934 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.451910 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="02016f6b-3a13-4224-98a2-f77e0c52261f" containerName="console" Apr 16 16:31:22.454821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.454798 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.457403 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.457376 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 16:31:22.457403 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.457393 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 16:31:22.458421 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.458402 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-9s6xb\"" Apr 16 16:31:22.467005 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.466977 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-rzd9r"] Apr 16 16:31:22.558943 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.558848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfb2f\" (UniqueName: \"kubernetes.io/projected/1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55-kube-api-access-sfb2f\") pod \"cert-manager-cainjector-8966b78d4-rzd9r\" (UID: \"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.558943 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.558934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-rzd9r\" (UID: \"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.571821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.571799 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:22.659456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.659419 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-util\") pod \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " Apr 16 16:31:22.659650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.659521 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-bundle\") pod \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " Apr 16 16:31:22.659650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.659559 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc79g\" (UniqueName: \"kubernetes.io/projected/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-kube-api-access-tc79g\") pod \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\" (UID: \"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e\") " Apr 16 16:31:22.659650 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.659633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfb2f\" (UniqueName: \"kubernetes.io/projected/1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55-kube-api-access-sfb2f\") pod \"cert-manager-cainjector-8966b78d4-rzd9r\" (UID: \"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.659820 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.659690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-rzd9r\" (UID: 
\"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.659967 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.659940 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-bundle" (OuterVolumeSpecName: "bundle") pod "f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" (UID: "f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:22.661718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.661680 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-kube-api-access-tc79g" (OuterVolumeSpecName: "kube-api-access-tc79g") pod "f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" (UID: "f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e"). InnerVolumeSpecName "kube-api-access-tc79g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:22.664367 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.664327 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-util" (OuterVolumeSpecName: "util") pod "f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" (UID: "f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:22.667538 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.667513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-rzd9r\" (UID: \"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.667625 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.667607 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfb2f\" (UniqueName: \"kubernetes.io/projected/1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55-kube-api-access-sfb2f\") pod \"cert-manager-cainjector-8966b78d4-rzd9r\" (UID: \"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.760306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.760220 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:22.760306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.760245 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tc79g\" (UniqueName: \"kubernetes.io/projected/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-kube-api-access-tc79g\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:22.760306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.760256 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:31:22.775157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.775103 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" Apr 16 16:31:22.899362 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:22.899285 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-rzd9r"] Apr 16 16:31:22.901624 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:22.901591 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b8d3534_d5f5_4018_8d49_9bdf0ee4cd55.slice/crio-1b4d610bced5bf45b562e9a2c25138b9139f62e70b6b03ca753c28e1c79a907f WatchSource:0}: Error finding container 1b4d610bced5bf45b562e9a2c25138b9139f62e70b6b03ca753c28e1c79a907f: Status 404 returned error can't find the container with id 1b4d610bced5bf45b562e9a2c25138b9139f62e70b6b03ca753c28e1c79a907f Apr 16 16:31:23.447815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:23.447781 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" event={"ID":"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55","Type":"ContainerStarted","Data":"1b4d610bced5bf45b562e9a2c25138b9139f62e70b6b03ca753c28e1c79a907f"} Apr 16 16:31:23.449336 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:23.449310 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" event={"ID":"f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e","Type":"ContainerDied","Data":"fa6d0278a95608a7392ffd0c40cd1c0a5a310afcf638efb7be9ffef991e1edf5"} Apr 16 16:31:23.449336 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:23.449339 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa6d0278a95608a7392ffd0c40cd1c0a5a310afcf638efb7be9ffef991e1edf5" Apr 16 16:31:23.449500 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:23.449373 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhgx74" Apr 16 16:31:26.462052 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:26.462016 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" event={"ID":"1b8d3534-d5f5-4018-8d49-9bdf0ee4cd55","Type":"ContainerStarted","Data":"f4cb989859e4c553e8e755569d1596fb8cfdd916b9b0d06d07ff54d1c6e21d1c"} Apr 16 16:31:26.482288 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:26.482239 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-rzd9r" podStartSLOduration=1.582403456 podStartE2EDuration="4.482222865s" podCreationTimestamp="2026-04-16 16:31:22 +0000 UTC" firstStartedPulling="2026-04-16 16:31:22.903695867 +0000 UTC m=+486.738548751" lastFinishedPulling="2026-04-16 16:31:25.803515264 +0000 UTC m=+489.638368160" observedRunningTime="2026-04-16 16:31:26.48124188 +0000 UTC m=+490.316094785" watchObservedRunningTime="2026-04-16 16:31:26.482222865 +0000 UTC m=+490.317075768" Apr 16 16:31:29.626848 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.626810 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt"] Apr 16 16:31:29.627251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.627174 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerName="extract" Apr 16 16:31:29.627251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.627186 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerName="extract" Apr 16 16:31:29.627251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.627195 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerName="util" Apr 16 
16:31:29.627251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.627202 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerName="util" Apr 16 16:31:29.627251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.627209 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerName="pull" Apr 16 16:31:29.627251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.627215 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerName="pull" Apr 16 16:31:29.627451 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.627270 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0043ea3-8fe5-4b61-87f7-2b8a697f6e7e" containerName="extract" Apr 16 16:31:29.631383 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.631361 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:29.633659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.633627 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-2j4l7\"" Apr 16 16:31:29.633793 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.633658 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 16:31:29.634432 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.634416 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:31:29.638308 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.638283 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt"] Apr 16 16:31:29.723865 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:31:29.723824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d325c16d-746b-4cdf-94e2-979b7831c5d3-tmp\") pod \"openshift-lws-operator-bfc7f696d-k5pnt\" (UID: \"d325c16d-746b-4cdf-94e2-979b7831c5d3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:29.723865 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.723865 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgtz7\" (UniqueName: \"kubernetes.io/projected/d325c16d-746b-4cdf-94e2-979b7831c5d3-kube-api-access-bgtz7\") pod \"openshift-lws-operator-bfc7f696d-k5pnt\" (UID: \"d325c16d-746b-4cdf-94e2-979b7831c5d3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:29.825020 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.824985 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d325c16d-746b-4cdf-94e2-979b7831c5d3-tmp\") pod \"openshift-lws-operator-bfc7f696d-k5pnt\" (UID: \"d325c16d-746b-4cdf-94e2-979b7831c5d3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:29.825216 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.825024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgtz7\" (UniqueName: \"kubernetes.io/projected/d325c16d-746b-4cdf-94e2-979b7831c5d3-kube-api-access-bgtz7\") pod \"openshift-lws-operator-bfc7f696d-k5pnt\" (UID: \"d325c16d-746b-4cdf-94e2-979b7831c5d3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:29.825544 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.825525 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d325c16d-746b-4cdf-94e2-979b7831c5d3-tmp\") pod 
\"openshift-lws-operator-bfc7f696d-k5pnt\" (UID: \"d325c16d-746b-4cdf-94e2-979b7831c5d3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:29.833469 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.833445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgtz7\" (UniqueName: \"kubernetes.io/projected/d325c16d-746b-4cdf-94e2-979b7831c5d3-kube-api-access-bgtz7\") pod \"openshift-lws-operator-bfc7f696d-k5pnt\" (UID: \"d325c16d-746b-4cdf-94e2-979b7831c5d3\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:29.941321 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:29.941290 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" Apr 16 16:31:30.064705 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:30.064681 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt"] Apr 16 16:31:30.067175 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:30.067146 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd325c16d_746b_4cdf_94e2_979b7831c5d3.slice/crio-98106cbe0fa0bff1209857f9d3513b0a841879ee72a4c2e4effa05181f0e7e6a WatchSource:0}: Error finding container 98106cbe0fa0bff1209857f9d3513b0a841879ee72a4c2e4effa05181f0e7e6a: Status 404 returned error can't find the container with id 98106cbe0fa0bff1209857f9d3513b0a841879ee72a4c2e4effa05181f0e7e6a Apr 16 16:31:30.483090 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:30.483052 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" event={"ID":"d325c16d-746b-4cdf-94e2-979b7831c5d3","Type":"ContainerStarted","Data":"98106cbe0fa0bff1209857f9d3513b0a841879ee72a4c2e4effa05181f0e7e6a"} Apr 16 16:31:32.491643 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:31:32.491602 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" event={"ID":"d325c16d-746b-4cdf-94e2-979b7831c5d3","Type":"ContainerStarted","Data":"0c3d892ffb8a517147964f56d9d9651b5c917cf18ea2b09145234a68e9ae1ef9"} Apr 16 16:31:32.509997 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:32.509932 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-k5pnt" podStartSLOduration=1.669757414 podStartE2EDuration="3.50991391s" podCreationTimestamp="2026-04-16 16:31:29 +0000 UTC" firstStartedPulling="2026-04-16 16:31:30.068807881 +0000 UTC m=+493.903660777" lastFinishedPulling="2026-04-16 16:31:31.908964387 +0000 UTC m=+495.743817273" observedRunningTime="2026-04-16 16:31:32.508959785 +0000 UTC m=+496.343812688" watchObservedRunningTime="2026-04-16 16:31:32.50991391 +0000 UTC m=+496.344766815" Apr 16 16:31:39.204200 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.204163 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp"] Apr 16 16:31:39.207785 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.207767 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.210273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.210245 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:31:39.210273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.210269 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-f4z7t\"" Apr 16 16:31:39.210440 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.210250 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:31:39.215573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.215550 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp"] Apr 16 16:31:39.304445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.304412 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.304445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.304449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.304661 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.304607 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv2cb\" (UniqueName: \"kubernetes.io/projected/53803c75-655f-477d-9554-ca1fe2342880-kube-api-access-nv2cb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.405373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.405330 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.405373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.405378 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.405576 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.405463 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv2cb\" (UniqueName: \"kubernetes.io/projected/53803c75-655f-477d-9554-ca1fe2342880-kube-api-access-nv2cb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.405776 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:31:39.405755 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.405817 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.405780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.414585 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.414550 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv2cb\" (UniqueName: \"kubernetes.io/projected/53803c75-655f-477d-9554-ca1fe2342880-kube-api-access-nv2cb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.521024 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.520925 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" Apr 16 16:31:39.647008 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:39.646977 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp"] Apr 16 16:31:39.648873 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:39.648845 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53803c75_655f_477d_9554_ca1fe2342880.slice/crio-9bd506c77a2701fe3a86ade9806af84d1f131cf658addb178d9da982a554e2a2 WatchSource:0}: Error finding container 9bd506c77a2701fe3a86ade9806af84d1f131cf658addb178d9da982a554e2a2: Status 404 returned error can't find the container with id 9bd506c77a2701fe3a86ade9806af84d1f131cf658addb178d9da982a554e2a2 Apr 16 16:31:40.527260 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:40.527223 2571 generic.go:358] "Generic (PLEG): container finished" podID="53803c75-655f-477d-9554-ca1fe2342880" containerID="a5ddad9dbc88345432ba04f4c6ff8ba6a74b177a94b5f8e41ad9b26e10a5368b" exitCode=0 Apr 16 16:31:40.527604 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:40.527311 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" event={"ID":"53803c75-655f-477d-9554-ca1fe2342880","Type":"ContainerDied","Data":"a5ddad9dbc88345432ba04f4c6ff8ba6a74b177a94b5f8e41ad9b26e10a5368b"} Apr 16 16:31:40.527604 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:40.527348 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" event={"ID":"53803c75-655f-477d-9554-ca1fe2342880","Type":"ContainerStarted","Data":"9bd506c77a2701fe3a86ade9806af84d1f131cf658addb178d9da982a554e2a2"} Apr 16 16:31:41.533226 ip-10-0-138-125 kubenswrapper[2571]: 
I0416 16:31:41.533192 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" event={"ID":"53803c75-655f-477d-9554-ca1fe2342880","Type":"ContainerStarted","Data":"7edaa858de97d459c7a5fb89a35f94e53374d38d10cd91313de6ee3156088d95"}
Apr 16 16:31:42.538201 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:42.538166 2571 generic.go:358] "Generic (PLEG): container finished" podID="53803c75-655f-477d-9554-ca1fe2342880" containerID="7edaa858de97d459c7a5fb89a35f94e53374d38d10cd91313de6ee3156088d95" exitCode=0
Apr 16 16:31:42.538674 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:42.538252 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" event={"ID":"53803c75-655f-477d-9554-ca1fe2342880","Type":"ContainerDied","Data":"7edaa858de97d459c7a5fb89a35f94e53374d38d10cd91313de6ee3156088d95"}
Apr 16 16:31:43.543520 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:43.543488 2571 generic.go:358] "Generic (PLEG): container finished" podID="53803c75-655f-477d-9554-ca1fe2342880" containerID="b4479ceb0e149c6a446d2d104d23830ac471a7a5d75950f61ee286b3f5d18220" exitCode=0
Apr 16 16:31:43.543862 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:43.543578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" event={"ID":"53803c75-655f-477d-9554-ca1fe2342880","Type":"ContainerDied","Data":"b4479ceb0e149c6a446d2d104d23830ac471a7a5d75950f61ee286b3f5d18220"}
Apr 16 16:31:44.669655 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.669625 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp"
Apr 16 16:31:44.743334 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.743295 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv2cb\" (UniqueName: \"kubernetes.io/projected/53803c75-655f-477d-9554-ca1fe2342880-kube-api-access-nv2cb\") pod \"53803c75-655f-477d-9554-ca1fe2342880\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") "
Apr 16 16:31:44.743542 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.743347 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-bundle\") pod \"53803c75-655f-477d-9554-ca1fe2342880\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") "
Apr 16 16:31:44.743542 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.743387 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-util\") pod \"53803c75-655f-477d-9554-ca1fe2342880\" (UID: \"53803c75-655f-477d-9554-ca1fe2342880\") "
Apr 16 16:31:44.744301 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.744274 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-bundle" (OuterVolumeSpecName: "bundle") pod "53803c75-655f-477d-9554-ca1fe2342880" (UID: "53803c75-655f-477d-9554-ca1fe2342880"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:31:44.745544 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.745517 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53803c75-655f-477d-9554-ca1fe2342880-kube-api-access-nv2cb" (OuterVolumeSpecName: "kube-api-access-nv2cb") pod "53803c75-655f-477d-9554-ca1fe2342880" (UID: "53803c75-655f-477d-9554-ca1fe2342880"). InnerVolumeSpecName "kube-api-access-nv2cb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:31:44.749108 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.749083 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-util" (OuterVolumeSpecName: "util") pod "53803c75-655f-477d-9554-ca1fe2342880" (UID: "53803c75-655f-477d-9554-ca1fe2342880"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:31:44.844980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.844874 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nv2cb\" (UniqueName: \"kubernetes.io/projected/53803c75-655f-477d-9554-ca1fe2342880-kube-api-access-nv2cb\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:31:44.844980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.844911 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:31:44.844980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:44.844928 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53803c75-655f-477d-9554-ca1fe2342880-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:31:45.553133 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:45.553082 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp" event={"ID":"53803c75-655f-477d-9554-ca1fe2342880","Type":"ContainerDied","Data":"9bd506c77a2701fe3a86ade9806af84d1f131cf658addb178d9da982a554e2a2"}
Apr 16 16:31:45.553133 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:45.553130 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835fsswp"
Apr 16 16:31:45.553348 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:45.553151 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bd506c77a2701fe3a86ade9806af84d1f131cf658addb178d9da982a554e2a2"
Apr 16 16:31:47.350628 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.350595 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"]
Apr 16 16:31:47.351030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.350978 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53803c75-655f-477d-9554-ca1fe2342880" containerName="pull"
Apr 16 16:31:47.351030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.350992 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53803c75-655f-477d-9554-ca1fe2342880" containerName="pull"
Apr 16 16:31:47.351030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.351007 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53803c75-655f-477d-9554-ca1fe2342880" containerName="extract"
Apr 16 16:31:47.351030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.351012 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53803c75-655f-477d-9554-ca1fe2342880" containerName="extract"
Apr 16 16:31:47.351030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.351024 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53803c75-655f-477d-9554-ca1fe2342880" containerName="util"
Apr 16 16:31:47.351030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.351029 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53803c75-655f-477d-9554-ca1fe2342880" containerName="util"
Apr 16 16:31:47.351257 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.351090 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="53803c75-655f-477d-9554-ca1fe2342880" containerName="extract"
Apr 16 16:31:47.355273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.355250 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.358537 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.358517 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 16:31:47.359318 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.359294 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 16:31:47.359443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.359294 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 16:31:47.359443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.359369 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lcdzz\""
Apr 16 16:31:47.368007 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.367983 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-cert\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.368150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.368035 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnjsq\" (UniqueName: \"kubernetes.io/projected/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-kube-api-access-cnjsq\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.368150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.368071 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-manager-config\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.368150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.368095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-metrics-cert\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.371000 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.370970 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"]
Apr 16 16:31:47.469001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.468970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-manager-config\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.469202 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.469017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-metrics-cert\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.469202 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.469070 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-cert\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.469202 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.469137 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnjsq\" (UniqueName: \"kubernetes.io/projected/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-kube-api-access-cnjsq\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.469798 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.469775 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-manager-config\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.471660 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.471634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-metrics-cert\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.471752 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.471720 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-cert\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.490466 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.490430 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnjsq\" (UniqueName: \"kubernetes.io/projected/fdb886ae-75bd-4c87-9ef8-29bcf06d0306-kube-api-access-cnjsq\") pod \"lws-controller-manager-846585b969-dp7jj\" (UID: \"fdb886ae-75bd-4c87-9ef8-29bcf06d0306\") " pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.664386 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.664350 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:47.805728 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:47.805695 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"]
Apr 16 16:31:47.808603 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:47.808570 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdb886ae_75bd_4c87_9ef8_29bcf06d0306.slice/crio-bf54fb0ee161742d9607d37de59a049250c1725b57a7632c2ba1a379d1a329c4 WatchSource:0}: Error finding container bf54fb0ee161742d9607d37de59a049250c1725b57a7632c2ba1a379d1a329c4: Status 404 returned error can't find the container with id bf54fb0ee161742d9607d37de59a049250c1725b57a7632c2ba1a379d1a329c4
Apr 16 16:31:48.565757 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:48.565722 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj" event={"ID":"fdb886ae-75bd-4c87-9ef8-29bcf06d0306","Type":"ContainerStarted","Data":"bf54fb0ee161742d9607d37de59a049250c1725b57a7632c2ba1a379d1a329c4"}
Apr 16 16:31:50.575551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:50.575515 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj" event={"ID":"fdb886ae-75bd-4c87-9ef8-29bcf06d0306","Type":"ContainerStarted","Data":"da3f268e0851c3acfe521b9fa152fa8815f168ea48a50063e5c3fad575f64804"}
Apr 16 16:31:50.575916 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:50.575577 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:31:50.594992 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:50.594922 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj" podStartSLOduration=1.604915243 podStartE2EDuration="3.594903369s" podCreationTimestamp="2026-04-16 16:31:47 +0000 UTC" firstStartedPulling="2026-04-16 16:31:47.810518247 +0000 UTC m=+511.645371129" lastFinishedPulling="2026-04-16 16:31:49.800506354 +0000 UTC m=+513.635359255" observedRunningTime="2026-04-16 16:31:50.594487095 +0000 UTC m=+514.429340028" watchObservedRunningTime="2026-04-16 16:31:50.594903369 +0000 UTC m=+514.429756277"
Apr 16 16:31:53.977108 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:53.977074 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"]
Apr 16 16:31:53.980417 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:53.980398 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:53.991635 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:53.991612 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:31:53.991768 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:53.991614 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:31:53.992466 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:53.992451 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-f4z7t\""
Apr 16 16:31:53.996857 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:53.996832 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"]
Apr 16 16:31:54.028356 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.028317 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.028521 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.028414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28r54\" (UniqueName: \"kubernetes.io/projected/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-kube-api-access-28r54\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.028521 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.028485 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.128941 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.128905 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.129144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.128963 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.129144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.129006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28r54\" (UniqueName: \"kubernetes.io/projected/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-kube-api-access-28r54\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.129321 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.129303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.129362 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.129325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.168162 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.168094 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28r54\" (UniqueName: \"kubernetes.io/projected/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-kube-api-access-28r54\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.289485 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.289400 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:54.441070 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.441042 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"]
Apr 16 16:31:54.443089 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:31:54.443058 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba1e987_8c97_4517_a5d2_065b68ffa5eb.slice/crio-8480be4ece4bd51c953aa1cb29619e17fcc7bb56dfb46542914a023c036adfa8 WatchSource:0}: Error finding container 8480be4ece4bd51c953aa1cb29619e17fcc7bb56dfb46542914a023c036adfa8: Status 404 returned error can't find the container with id 8480be4ece4bd51c953aa1cb29619e17fcc7bb56dfb46542914a023c036adfa8
Apr 16 16:31:54.591368 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.591328 2571 generic.go:358] "Generic (PLEG): container finished" podID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerID="4b2379fe428252cd1d7e8367532cf26e01b905fc35484adbed06287e6340da14" exitCode=0
Apr 16 16:31:54.591509 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.591410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp" event={"ID":"5ba1e987-8c97-4517-a5d2-065b68ffa5eb","Type":"ContainerDied","Data":"4b2379fe428252cd1d7e8367532cf26e01b905fc35484adbed06287e6340da14"}
Apr 16 16:31:54.591509 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:54.591449 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp" event={"ID":"5ba1e987-8c97-4517-a5d2-065b68ffa5eb","Type":"ContainerStarted","Data":"8480be4ece4bd51c953aa1cb29619e17fcc7bb56dfb46542914a023c036adfa8"}
Apr 16 16:31:56.600067 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:56.600035 2571 generic.go:358] "Generic (PLEG): container finished" podID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerID="92056bfe2670e632bf133ea50b7827bd99cfec3241cfea17665eb703c8ceb04c" exitCode=0
Apr 16 16:31:56.600488 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:56.600145 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp" event={"ID":"5ba1e987-8c97-4517-a5d2-065b68ffa5eb","Type":"ContainerDied","Data":"92056bfe2670e632bf133ea50b7827bd99cfec3241cfea17665eb703c8ceb04c"}
Apr 16 16:31:57.606675 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:57.606643 2571 generic.go:358] "Generic (PLEG): container finished" podID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerID="f7fc4e23a8d6d74dffe58dae1e97a7c9d7417e0699d6bc3a94e821053ccd5cea" exitCode=0
Apr 16 16:31:57.607055 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:57.606682 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp" event={"ID":"5ba1e987-8c97-4517-a5d2-065b68ffa5eb","Type":"ContainerDied","Data":"f7fc4e23a8d6d74dffe58dae1e97a7c9d7417e0699d6bc3a94e821053ccd5cea"}
Apr 16 16:31:58.737554 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.737529 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:31:58.770956 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.770921 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-util\") pod \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") "
Apr 16 16:31:58.771146 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.771001 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-bundle\") pod \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") "
Apr 16 16:31:58.771146 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.771080 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28r54\" (UniqueName: \"kubernetes.io/projected/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-kube-api-access-28r54\") pod \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\" (UID: \"5ba1e987-8c97-4517-a5d2-065b68ffa5eb\") "
Apr 16 16:31:58.772090 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.772059 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-bundle" (OuterVolumeSpecName: "bundle") pod "5ba1e987-8c97-4517-a5d2-065b68ffa5eb" (UID: "5ba1e987-8c97-4517-a5d2-065b68ffa5eb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:31:58.773103 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.773077 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-kube-api-access-28r54" (OuterVolumeSpecName: "kube-api-access-28r54") pod "5ba1e987-8c97-4517-a5d2-065b68ffa5eb" (UID: "5ba1e987-8c97-4517-a5d2-065b68ffa5eb"). InnerVolumeSpecName "kube-api-access-28r54". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:31:58.779088 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.779033 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-util" (OuterVolumeSpecName: "util") pod "5ba1e987-8c97-4517-a5d2-065b68ffa5eb" (UID: "5ba1e987-8c97-4517-a5d2-065b68ffa5eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:31:58.872243 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.872156 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28r54\" (UniqueName: \"kubernetes.io/projected/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-kube-api-access-28r54\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:31:58.872243 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.872189 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:31:58.872243 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:58.872199 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5ba1e987-8c97-4517-a5d2-065b68ffa5eb-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:31:59.615441 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:59.615411 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp" event={"ID":"5ba1e987-8c97-4517-a5d2-065b68ffa5eb","Type":"ContainerDied","Data":"8480be4ece4bd51c953aa1cb29619e17fcc7bb56dfb46542914a023c036adfa8"}
Apr 16 16:31:59.615441 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:59.615445 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8480be4ece4bd51c953aa1cb29619e17fcc7bb56dfb46542914a023c036adfa8"
Apr 16 16:31:59.615648 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:31:59.615473 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2cwjxp"
Apr 16 16:32:01.581493 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:01.581462 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-846585b969-dp7jj"
Apr 16 16:32:23.298200 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298164 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"]
Apr 16 16:32:23.298547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298516 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerName="pull"
Apr 16 16:32:23.298547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298526 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerName="pull"
Apr 16 16:32:23.298547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298538 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerName="util"
Apr 16 16:32:23.298547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298543 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerName="util"
Apr 16 16:32:23.298682 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298554 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerName="extract"
Apr 16 16:32:23.298682 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298559 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerName="extract"
Apr 16 16:32:23.298682 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.298614 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ba1e987-8c97-4517-a5d2-065b68ffa5eb" containerName="extract"
Apr 16 16:32:23.304392 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.304368 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.306964 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.306941 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-f4z7t\""
Apr 16 16:32:23.307345 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.307322 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:32:23.307836 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.307818 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:32:23.312042 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.312019 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"]
Apr 16 16:32:23.391861 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.391830 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.392083 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.391893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nlx\" (UniqueName: \"kubernetes.io/projected/a132f0c6-3ca4-4444-9416-077465e4b4d8-kube-api-access-69nlx\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.392083 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.391987 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.397916 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.397888 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"]
Apr 16 16:32:23.402624 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.402606 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"
Apr 16 16:32:23.418711 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.418683 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"]
Apr 16 16:32:23.493213 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"
Apr 16 16:32:23.493213 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493215 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"
Apr 16 16:32:23.493447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ff9v\" (UniqueName: \"kubernetes.io/projected/06d44bae-1364-44e0-a0cb-b44e76100441-kube-api-access-8ff9v\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"
Apr 16 16:32:23.493447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.493447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493343 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69nlx\" (UniqueName: \"kubernetes.io/projected/a132f0c6-3ca4-4444-9416-077465e4b4d8-kube-api-access-69nlx\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.493447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493397 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.493745 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493724 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-bundle\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:23.493787 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.493740 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName:
\"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-util\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" Apr 16 16:32:23.505902 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.505867 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nlx\" (UniqueName: \"kubernetes.io/projected/a132f0c6-3ca4-4444-9416-077465e4b4d8-kube-api-access-69nlx\") pod \"d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") " pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" Apr 16 16:32:23.558400 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.558316 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6"] Apr 16 16:32:23.562993 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.562974 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.571588 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.571562 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6"] Apr 16 16:32:23.593965 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.593924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" Apr 16 16:32:23.593965 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.593968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" Apr 16 16:32:23.594255 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.594025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ff9v\" (UniqueName: \"kubernetes.io/projected/06d44bae-1364-44e0-a0cb-b44e76100441-kube-api-access-8ff9v\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" Apr 16 16:32:23.594352 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.594327 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-bundle\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" Apr 16 16:32:23.594408 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.594394 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-util\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" Apr 16 16:32:23.612266 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.612230 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ff9v\" (UniqueName: \"kubernetes.io/projected/06d44bae-1364-44e0-a0cb-b44e76100441-kube-api-access-8ff9v\") pod \"309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") " pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" Apr 16 16:32:23.614251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.614230 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg"] Apr 16 16:32:23.615864 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.615842 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" Apr 16 16:32:23.618792 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.618770 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.628584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.628558 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg"] Apr 16 16:32:23.695498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.695468 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.695652 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.695548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4grt\" (UniqueName: \"kubernetes.io/projected/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-kube-api-access-x4grt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.695652 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.695580 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.711378 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.711343 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" Apr 16 16:32:23.756005 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.755980 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"] Apr 16 16:32:23.796389 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796357 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.796477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796413 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.796536 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796493 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vsn2\" (UniqueName: \"kubernetes.io/projected/c0e1a030-9d0e-481d-9004-06b318d00418-kube-api-access-2vsn2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.796536 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796518 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x4grt\" (UniqueName: \"kubernetes.io/projected/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-kube-api-access-x4grt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.796632 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.796632 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796576 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.798844 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-util\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.798844 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.796804 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-bundle\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.806646 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.806604 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4grt\" (UniqueName: \"kubernetes.io/projected/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-kube-api-access-x4grt\") pod \"ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") " pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.853944 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.853918 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"] Apr 16 16:32:23.856314 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:32:23.856275 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d44bae_1364_44e0_a0cb_b44e76100441.slice/crio-e226111eadd7f09ca82cb531082b456c44e680a99f6389d2763c9004e542255c WatchSource:0}: Error finding container e226111eadd7f09ca82cb531082b456c44e680a99f6389d2763c9004e542255c: Status 404 returned error can't find the container with id e226111eadd7f09ca82cb531082b456c44e680a99f6389d2763c9004e542255c Apr 16 16:32:23.873746 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.873723 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" Apr 16 16:32:23.897777 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.897743 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.897919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.897836 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vsn2\" (UniqueName: \"kubernetes.io/projected/c0e1a030-9d0e-481d-9004-06b318d00418-kube-api-access-2vsn2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.897919 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.897902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.898189 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.898166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-bundle\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " 
pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.898240 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.898196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-util\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.907305 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.907278 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vsn2\" (UniqueName: \"kubernetes.io/projected/c0e1a030-9d0e-481d-9004-06b318d00418-kube-api-access-2vsn2\") pod \"5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") " pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:23.945985 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:23.945952 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" Apr 16 16:32:24.006375 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.006348 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6"] Apr 16 16:32:24.008942 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:32:24.008887 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb6975d_f4e3_40ad_a4c5_0c2bc107d860.slice/crio-91df7cf0c436c72a4965d1d801944e44028b11c99aaa571e166f632b5b3e38ac WatchSource:0}: Error finding container 91df7cf0c436c72a4965d1d801944e44028b11c99aaa571e166f632b5b3e38ac: Status 404 returned error can't find the container with id 91df7cf0c436c72a4965d1d801944e44028b11c99aaa571e166f632b5b3e38ac Apr 16 16:32:24.083394 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.083366 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg"] Apr 16 16:32:24.084898 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:32:24.084872 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0e1a030_9d0e_481d_9004_06b318d00418.slice/crio-88a78428707ba648440e06fd074a4511374024e32f2a5cf966612d1189edab4b WatchSource:0}: Error finding container 88a78428707ba648440e06fd074a4511374024e32f2a5cf966612d1189edab4b: Status 404 returned error can't find the container with id 88a78428707ba648440e06fd074a4511374024e32f2a5cf966612d1189edab4b Apr 16 16:32:24.716836 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.716804 2571 generic.go:358] "Generic (PLEG): container finished" podID="c0e1a030-9d0e-481d-9004-06b318d00418" containerID="6349b3f25e9c9b7fe43e569df8c38767b592d93980cc777620bbf2b53a26e820" exitCode=0 Apr 16 16:32:24.717351 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.716864 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" event={"ID":"c0e1a030-9d0e-481d-9004-06b318d00418","Type":"ContainerDied","Data":"6349b3f25e9c9b7fe43e569df8c38767b592d93980cc777620bbf2b53a26e820"} Apr 16 16:32:24.717351 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.716886 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" event={"ID":"c0e1a030-9d0e-481d-9004-06b318d00418","Type":"ContainerStarted","Data":"88a78428707ba648440e06fd074a4511374024e32f2a5cf966612d1189edab4b"} Apr 16 16:32:24.718273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.718237 2571 generic.go:358] "Generic (PLEG): container finished" podID="06d44bae-1364-44e0-a0cb-b44e76100441" containerID="7b2ccc4525bfd8c3269b5f97f29fbe10342650c0397048b2f92091ab6361747d" exitCode=0 Apr 16 16:32:24.718391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.718312 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" event={"ID":"06d44bae-1364-44e0-a0cb-b44e76100441","Type":"ContainerDied","Data":"7b2ccc4525bfd8c3269b5f97f29fbe10342650c0397048b2f92091ab6361747d"} Apr 16 16:32:24.718391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.718339 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" event={"ID":"06d44bae-1364-44e0-a0cb-b44e76100441","Type":"ContainerStarted","Data":"e226111eadd7f09ca82cb531082b456c44e680a99f6389d2763c9004e542255c"} Apr 16 16:32:24.719856 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.719838 2571 generic.go:358] "Generic (PLEG): container finished" podID="a132f0c6-3ca4-4444-9416-077465e4b4d8" 
containerID="803dbe5099345385f17cd728cc7dcc53786a3f84068fb6cae8f6bbd0ca10dc03" exitCode=0 Apr 16 16:32:24.719949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.719897 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" event={"ID":"a132f0c6-3ca4-4444-9416-077465e4b4d8","Type":"ContainerDied","Data":"803dbe5099345385f17cd728cc7dcc53786a3f84068fb6cae8f6bbd0ca10dc03"} Apr 16 16:32:24.719949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.719924 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" event={"ID":"a132f0c6-3ca4-4444-9416-077465e4b4d8","Type":"ContainerStarted","Data":"36df09338f74266d333fb9e840b7e7b90e4efb9bebfd0b5821b5dbc06045babe"} Apr 16 16:32:24.721239 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.721220 2571 generic.go:358] "Generic (PLEG): container finished" podID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerID="d23cede6a5a41079a84065745824f118214e0f3ca2831573ccf8bd16233293a7" exitCode=0 Apr 16 16:32:24.721328 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.721304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" event={"ID":"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860","Type":"ContainerDied","Data":"d23cede6a5a41079a84065745824f118214e0f3ca2831573ccf8bd16233293a7"} Apr 16 16:32:24.721372 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:24.721330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" event={"ID":"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860","Type":"ContainerStarted","Data":"91df7cf0c436c72a4965d1d801944e44028b11c99aaa571e166f632b5b3e38ac"} Apr 16 16:32:25.728369 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:25.728322 2571 generic.go:358] "Generic 
(PLEG): container finished" podID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerID="bba39abd4d44361b0ed74688b12aa663eb361bbc5587737e451b624b3eb89bbf" exitCode=0 Apr 16 16:32:25.728817 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:25.728390 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" event={"ID":"a132f0c6-3ca4-4444-9416-077465e4b4d8","Type":"ContainerDied","Data":"bba39abd4d44361b0ed74688b12aa663eb361bbc5587737e451b624b3eb89bbf"} Apr 16 16:32:26.734381 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.734347 2571 generic.go:358] "Generic (PLEG): container finished" podID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerID="2a6e252d6f488bb9c01eea55d665dbd959fd55ccfc67f923007d2fb183f8d1fb" exitCode=0 Apr 16 16:32:26.734785 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.734428 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" event={"ID":"a132f0c6-3ca4-4444-9416-077465e4b4d8","Type":"ContainerDied","Data":"2a6e252d6f488bb9c01eea55d665dbd959fd55ccfc67f923007d2fb183f8d1fb"} Apr 16 16:32:26.736022 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.736000 2571 generic.go:358] "Generic (PLEG): container finished" podID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerID="824e9b4e8b8aea28b66a2f8fdfb340ebd4849db73d548bb91500b0289a013436" exitCode=0 Apr 16 16:32:26.736140 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.736090 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" event={"ID":"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860","Type":"ContainerDied","Data":"824e9b4e8b8aea28b66a2f8fdfb340ebd4849db73d548bb91500b0289a013436"} Apr 16 16:32:26.737936 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.737914 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="c0e1a030-9d0e-481d-9004-06b318d00418" containerID="e9d0c0e6469f36fd10aad141168415261ebc2a9ef399dde748af14cb1eb43110" exitCode=0 Apr 16 16:32:26.738040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.737974 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" event={"ID":"c0e1a030-9d0e-481d-9004-06b318d00418","Type":"ContainerDied","Data":"e9d0c0e6469f36fd10aad141168415261ebc2a9ef399dde748af14cb1eb43110"} Apr 16 16:32:26.739726 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.739707 2571 generic.go:358] "Generic (PLEG): container finished" podID="06d44bae-1364-44e0-a0cb-b44e76100441" containerID="45627e80612604ca7c7fcf0881ff2b33544d2cb1c87c0256bd6bf0e1cb959239" exitCode=0 Apr 16 16:32:26.739816 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:26.739738 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" event={"ID":"06d44bae-1364-44e0-a0cb-b44e76100441","Type":"ContainerDied","Data":"45627e80612604ca7c7fcf0881ff2b33544d2cb1c87c0256bd6bf0e1cb959239"} Apr 16 16:32:27.745756 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:27.745723 2571 generic.go:358] "Generic (PLEG): container finished" podID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerID="f5a9479065d03a98f698ae8e87f83e9f2aa969896c6754a290704bfe55efe2c4" exitCode=0 Apr 16 16:32:27.746202 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:27.745763 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" event={"ID":"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860","Type":"ContainerDied","Data":"f5a9479065d03a98f698ae8e87f83e9f2aa969896c6754a290704bfe55efe2c4"} Apr 16 16:32:27.747780 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:27.747758 2571 generic.go:358] "Generic (PLEG): container finished" 
podID="c0e1a030-9d0e-481d-9004-06b318d00418" containerID="50a8ad980a4e2a5f86e74f9d3baf7d3712b4c013081b9e5449fd65fe64790701" exitCode=0
Apr 16 16:32:27.747882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:27.747801 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" event={"ID":"c0e1a030-9d0e-481d-9004-06b318d00418","Type":"ContainerDied","Data":"50a8ad980a4e2a5f86e74f9d3baf7d3712b4c013081b9e5449fd65fe64790701"}
Apr 16 16:32:27.749685 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:27.749655 2571 generic.go:358] "Generic (PLEG): container finished" podID="06d44bae-1364-44e0-a0cb-b44e76100441" containerID="5e8a6e6fe3c4b56a9790b8b0eeae15d41ba3e2660a5cc1189efac121f6e00afd" exitCode=0
Apr 16 16:32:27.749685 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:27.749679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" event={"ID":"06d44bae-1364-44e0-a0cb-b44e76100441","Type":"ContainerDied","Data":"5e8a6e6fe3c4b56a9790b8b0eeae15d41ba3e2660a5cc1189efac121f6e00afd"}
Apr 16 16:32:27.880001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:27.879978 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:28.034599 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.034562 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69nlx\" (UniqueName: \"kubernetes.io/projected/a132f0c6-3ca4-4444-9416-077465e4b4d8-kube-api-access-69nlx\") pod \"a132f0c6-3ca4-4444-9416-077465e4b4d8\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") "
Apr 16 16:32:28.034760 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.034692 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-bundle\") pod \"a132f0c6-3ca4-4444-9416-077465e4b4d8\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") "
Apr 16 16:32:28.034760 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.034714 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-util\") pod \"a132f0c6-3ca4-4444-9416-077465e4b4d8\" (UID: \"a132f0c6-3ca4-4444-9416-077465e4b4d8\") "
Apr 16 16:32:28.035226 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.035198 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-bundle" (OuterVolumeSpecName: "bundle") pod "a132f0c6-3ca4-4444-9416-077465e4b4d8" (UID: "a132f0c6-3ca4-4444-9416-077465e4b4d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:28.036616 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.036595 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a132f0c6-3ca4-4444-9416-077465e4b4d8-kube-api-access-69nlx" (OuterVolumeSpecName: "kube-api-access-69nlx") pod "a132f0c6-3ca4-4444-9416-077465e4b4d8" (UID: "a132f0c6-3ca4-4444-9416-077465e4b4d8"). InnerVolumeSpecName "kube-api-access-69nlx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:32:28.041424 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.041392 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-util" (OuterVolumeSpecName: "util") pod "a132f0c6-3ca4-4444-9416-077465e4b4d8" (UID: "a132f0c6-3ca4-4444-9416-077465e4b4d8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:28.135573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.135495 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69nlx\" (UniqueName: \"kubernetes.io/projected/a132f0c6-3ca4-4444-9416-077465e4b4d8-kube-api-access-69nlx\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:28.135573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.135523 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:28.135573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.135532 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a132f0c6-3ca4-4444-9416-077465e4b4d8-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:28.755412 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.755373 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq" event={"ID":"a132f0c6-3ca4-4444-9416-077465e4b4d8","Type":"ContainerDied","Data":"36df09338f74266d333fb9e840b7e7b90e4efb9bebfd0b5821b5dbc06045babe"}
Apr 16 16:32:28.755412 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.755410 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36df09338f74266d333fb9e840b7e7b90e4efb9bebfd0b5821b5dbc06045babe"
Apr 16 16:32:28.756274 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.755983 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d54990151e6a3040f48962708f776cca7120e97625eb3a76d6fde2767bbk6jq"
Apr 16 16:32:28.905356 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.905328 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"
Apr 16 16:32:28.943439 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.943413 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6"
Apr 16 16:32:28.952856 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:28.952832 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg"
Apr 16 16:32:29.043878 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.043783 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-util\") pod \"06d44bae-1364-44e0-a0cb-b44e76100441\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") "
Apr 16 16:32:29.043878 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.043838 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4grt\" (UniqueName: \"kubernetes.io/projected/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-kube-api-access-x4grt\") pod \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") "
Apr 16 16:32:29.043878 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.043858 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-bundle\") pod \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") "
Apr 16 16:32:29.044197 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.043886 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-util\") pod \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\" (UID: \"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860\") "
Apr 16 16:32:29.044197 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.043922 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ff9v\" (UniqueName: \"kubernetes.io/projected/06d44bae-1364-44e0-a0cb-b44e76100441-kube-api-access-8ff9v\") pod \"06d44bae-1364-44e0-a0cb-b44e76100441\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") "
Apr 16 16:32:29.044197 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.043969 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-bundle\") pod \"06d44bae-1364-44e0-a0cb-b44e76100441\" (UID: \"06d44bae-1364-44e0-a0cb-b44e76100441\") "
Apr 16 16:32:29.044704 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.044464 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-bundle" (OuterVolumeSpecName: "bundle") pod "bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" (UID: "bfb6975d-f4e3-40ad-a4c5-0c2bc107d860"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:29.044704 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.044605 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-bundle" (OuterVolumeSpecName: "bundle") pod "06d44bae-1364-44e0-a0cb-b44e76100441" (UID: "06d44bae-1364-44e0-a0cb-b44e76100441"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:29.046304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.046277 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-kube-api-access-x4grt" (OuterVolumeSpecName: "kube-api-access-x4grt") pod "bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" (UID: "bfb6975d-f4e3-40ad-a4c5-0c2bc107d860"). InnerVolumeSpecName "kube-api-access-x4grt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:32:29.046460 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.046441 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d44bae-1364-44e0-a0cb-b44e76100441-kube-api-access-8ff9v" (OuterVolumeSpecName: "kube-api-access-8ff9v") pod "06d44bae-1364-44e0-a0cb-b44e76100441" (UID: "06d44bae-1364-44e0-a0cb-b44e76100441"). InnerVolumeSpecName "kube-api-access-8ff9v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:32:29.050107 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.050085 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-util" (OuterVolumeSpecName: "util") pod "06d44bae-1364-44e0-a0cb-b44e76100441" (UID: "06d44bae-1364-44e0-a0cb-b44e76100441"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:29.050174 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.050112 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-util" (OuterVolumeSpecName: "util") pod "bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" (UID: "bfb6975d-f4e3-40ad-a4c5-0c2bc107d860"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:29.144759 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.144699 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vsn2\" (UniqueName: \"kubernetes.io/projected/c0e1a030-9d0e-481d-9004-06b318d00418-kube-api-access-2vsn2\") pod \"c0e1a030-9d0e-481d-9004-06b318d00418\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") "
Apr 16 16:32:29.144973 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.144779 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-util\") pod \"c0e1a030-9d0e-481d-9004-06b318d00418\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") "
Apr 16 16:32:29.144973 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.144884 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-bundle\") pod \"c0e1a030-9d0e-481d-9004-06b318d00418\" (UID: \"c0e1a030-9d0e-481d-9004-06b318d00418\") "
Apr 16 16:32:29.145188 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.145171 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.145267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.145195 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x4grt\" (UniqueName: \"kubernetes.io/projected/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-kube-api-access-x4grt\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.145267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.145209 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.145267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.145223 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bfb6975d-f4e3-40ad-a4c5-0c2bc107d860-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.145267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.145237 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ff9v\" (UniqueName: \"kubernetes.io/projected/06d44bae-1364-44e0-a0cb-b44e76100441-kube-api-access-8ff9v\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.145267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.145252 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06d44bae-1364-44e0-a0cb-b44e76100441-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.145496 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.145471 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-bundle" (OuterVolumeSpecName: "bundle") pod "c0e1a030-9d0e-481d-9004-06b318d00418" (UID: "c0e1a030-9d0e-481d-9004-06b318d00418"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:29.146818 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.146793 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0e1a030-9d0e-481d-9004-06b318d00418-kube-api-access-2vsn2" (OuterVolumeSpecName: "kube-api-access-2vsn2") pod "c0e1a030-9d0e-481d-9004-06b318d00418" (UID: "c0e1a030-9d0e-481d-9004-06b318d00418"). InnerVolumeSpecName "kube-api-access-2vsn2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:32:29.150001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.149963 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-util" (OuterVolumeSpecName: "util") pod "c0e1a030-9d0e-481d-9004-06b318d00418" (UID: "c0e1a030-9d0e-481d-9004-06b318d00418"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:32:29.246455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.246413 2571 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-bundle\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.246455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.246449 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2vsn2\" (UniqueName: \"kubernetes.io/projected/c0e1a030-9d0e-481d-9004-06b318d00418-kube-api-access-2vsn2\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.246455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.246461 2571 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0e1a030-9d0e-481d-9004-06b318d00418-util\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:32:29.761158 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.761096 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg" event={"ID":"c0e1a030-9d0e-481d-9004-06b318d00418","Type":"ContainerDied","Data":"88a78428707ba648440e06fd074a4511374024e32f2a5cf966612d1189edab4b"}
Apr 16 16:32:29.761158 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.761160 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88a78428707ba648440e06fd074a4511374024e32f2a5cf966612d1189edab4b"
Apr 16 16:32:29.761636 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.761155 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b107722e7a95ad7441cca2226d3d656a8bf76dc65e15066bc01f8e5036bfrg"
Apr 16 16:32:29.762872 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.762846 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch" event={"ID":"06d44bae-1364-44e0-a0cb-b44e76100441","Type":"ContainerDied","Data":"e226111eadd7f09ca82cb531082b456c44e680a99f6389d2763c9004e542255c"}
Apr 16 16:32:29.763001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.762877 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e226111eadd7f09ca82cb531082b456c44e680a99f6389d2763c9004e542255c"
Apr 16 16:32:29.763001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.762893 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/309c32b5cc4cbd8148882f0ed2adbe6c47ee2761cc0c22c627755a3c30zjbch"
Apr 16 16:32:29.764535 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.764515 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6" event={"ID":"bfb6975d-f4e3-40ad-a4c5-0c2bc107d860","Type":"ContainerDied","Data":"91df7cf0c436c72a4965d1d801944e44028b11c99aaa571e166f632b5b3e38ac"}
Apr 16 16:32:29.764535 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.764536 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91df7cf0c436c72a4965d1d801944e44028b11c99aaa571e166f632b5b3e38ac"
Apr 16 16:32:29.764661 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:29.764544 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/ef1189e9861fa30b9414ceb420c2d78e85403a7e10097f37afdacfec889fks6"
Apr 16 16:32:47.555092 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555053 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"]
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555414 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06d44bae-1364-44e0-a0cb-b44e76100441" containerName="util"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555426 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d44bae-1364-44e0-a0cb-b44e76100441" containerName="util"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555437 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0e1a030-9d0e-481d-9004-06b318d00418" containerName="pull"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555442 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e1a030-9d0e-481d-9004-06b318d00418" containerName="pull"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555451 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerName="pull"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555457 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerName="pull"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555464 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerName="extract"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555469 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerName="extract"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555476 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0e1a030-9d0e-481d-9004-06b318d00418" containerName="extract"
Apr 16 16:32:47.555480 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555481 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e1a030-9d0e-481d-9004-06b318d00418" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555489 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerName="pull"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555494 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerName="pull"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555505 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerName="util"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555509 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerName="util"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555518 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06d44bae-1364-44e0-a0cb-b44e76100441" containerName="pull"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555523 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d44bae-1364-44e0-a0cb-b44e76100441" containerName="pull"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555530 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555536 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555542 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06d44bae-1364-44e0-a0cb-b44e76100441" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555548 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d44bae-1364-44e0-a0cb-b44e76100441" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555557 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerName="util"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555562 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerName="util"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555567 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0e1a030-9d0e-481d-9004-06b318d00418" containerName="util"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555572 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0e1a030-9d0e-481d-9004-06b318d00418" containerName="util"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555621 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0e1a030-9d0e-481d-9004-06b318d00418" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555627 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="06d44bae-1364-44e0-a0cb-b44e76100441" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555634 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bfb6975d-f4e3-40ad-a4c5-0c2bc107d860" containerName="extract"
Apr 16 16:32:47.555815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.555643 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a132f0c6-3ca4-4444-9416-077465e4b4d8" containerName="extract"
Apr 16 16:32:47.558458 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.558441 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.560889 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.560863 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 16:32:47.560996 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.560916 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 16:32:47.561062 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.561035 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 16:32:47.561174 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.561161 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-szw9h\""
Apr 16 16:32:47.561739 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.561721 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 16:32:47.567411 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.567386 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"]
Apr 16 16:32:47.606778 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.606742 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca68a685-0c86-4975-91b3-eb93ee8b65b6-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.606917 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.606783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sknd\" (UniqueName: \"kubernetes.io/projected/ca68a685-0c86-4975-91b3-eb93ee8b65b6-kube-api-access-9sknd\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.606917 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.606891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca68a685-0c86-4975-91b3-eb93ee8b65b6-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.707386 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.707356 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca68a685-0c86-4975-91b3-eb93ee8b65b6-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.707557 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.707440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca68a685-0c86-4975-91b3-eb93ee8b65b6-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.707557 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.707471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sknd\" (UniqueName: \"kubernetes.io/projected/ca68a685-0c86-4975-91b3-eb93ee8b65b6-kube-api-access-9sknd\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.708009 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.707985 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca68a685-0c86-4975-91b3-eb93ee8b65b6-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.710247 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.710217 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca68a685-0c86-4975-91b3-eb93ee8b65b6-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.718913 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.718882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sknd\" (UniqueName: \"kubernetes.io/projected/ca68a685-0c86-4975-91b3-eb93ee8b65b6-kube-api-access-9sknd\") pod \"kuadrant-console-plugin-6c886788f8-ncp8v\" (UID: \"ca68a685-0c86-4975-91b3-eb93ee8b65b6\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:47.868235 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:47.868158 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"
Apr 16 16:32:48.003099 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:48.003074 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v"]
Apr 16 16:32:48.005292 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:32:48.005261 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca68a685_0c86_4975_91b3_eb93ee8b65b6.slice/crio-1aa74494263184f38d9b2e321c2c5ed1e805c091ea4335f9124b66be6acf1f8f WatchSource:0}: Error finding container 1aa74494263184f38d9b2e321c2c5ed1e805c091ea4335f9124b66be6acf1f8f: Status 404 returned error can't find the container with id 1aa74494263184f38d9b2e321c2c5ed1e805c091ea4335f9124b66be6acf1f8f
Apr 16 16:32:48.839608 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:48.839556 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v" event={"ID":"ca68a685-0c86-4975-91b3-eb93ee8b65b6","Type":"ContainerStarted","Data":"1aa74494263184f38d9b2e321c2c5ed1e805c091ea4335f9124b66be6acf1f8f"}
Apr 16 16:32:52.856606 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:52.856559 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v" event={"ID":"ca68a685-0c86-4975-91b3-eb93ee8b65b6","Type":"ContainerStarted","Data":"c8e2e5cdcd8d723a76525a97df03bf626e57fe375361fa9501fdf4a17a7e73a7"}
Apr 16 16:32:52.882726 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:32:52.882670 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-ncp8v" podStartSLOduration=1.499171008 podStartE2EDuration="5.882655908s" podCreationTimestamp="2026-04-16 16:32:47 +0000 UTC" firstStartedPulling="2026-04-16 16:32:48.006592937 +0000 UTC m=+571.841445819" lastFinishedPulling="2026-04-16 16:32:52.390077837 +0000 UTC m=+576.224930719" observedRunningTime="2026-04-16 16:32:52.880297335 +0000 UTC m=+576.715150242" watchObservedRunningTime="2026-04-16 16:32:52.882655908 +0000 UTC m=+576.717508811"
Apr 16 16:33:16.688363 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:33:16.688330 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:33:16.690233 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:33:16.690209 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:33:16.691802 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:33:16.691777 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log"
Apr 16 16:33:16.694695 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:33:16.694675 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log"
Apr 16 16:34:22.511753 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.511717 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-d4kjz"]
Apr 16 16:34:22.515233 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.515212 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-d4kjz"
Apr 16 16:34:22.518570 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.518547 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-f89hw\""
Apr 16 16:34:22.532576 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.532542 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-d4kjz"]
Apr 16 16:34:22.554697 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.554661 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wljd\" (UniqueName: \"kubernetes.io/projected/08b4df05-dbc3-409b-8dec-5e7dbc4fbec3-kube-api-access-4wljd\") pod \"authorino-674b59b84c-d4kjz\" (UID: \"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3\") " pod="kuadrant-system/authorino-674b59b84c-d4kjz"
Apr 16 16:34:22.655452 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.655408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wljd\" (UniqueName: \"kubernetes.io/projected/08b4df05-dbc3-409b-8dec-5e7dbc4fbec3-kube-api-access-4wljd\") pod \"authorino-674b59b84c-d4kjz\" (UID: \"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3\") " pod="kuadrant-system/authorino-674b59b84c-d4kjz"
Apr 16 16:34:22.674499 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.674465 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wljd\" (UniqueName: \"kubernetes.io/projected/08b4df05-dbc3-409b-8dec-5e7dbc4fbec3-kube-api-access-4wljd\") pod \"authorino-674b59b84c-d4kjz\" (UID: \"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3\") " pod="kuadrant-system/authorino-674b59b84c-d4kjz"
Apr 16 16:34:22.771287 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.771202 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dx9cb"]
Apr 16 16:34:22.774599 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.774581 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dx9cb"
Apr 16 16:34:22.788906 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.788878 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dx9cb"]
Apr 16 16:34:22.825249 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.825212 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-d4kjz"
Apr 16 16:34:22.856278 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.856238 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqvhk\" (UniqueName: \"kubernetes.io/projected/0f19f5b1-6ec8-47ce-afa2-8b689345bedf-kube-api-access-rqvhk\") pod \"authorino-79cbc94b89-dx9cb\" (UID: \"0f19f5b1-6ec8-47ce-afa2-8b689345bedf\") " pod="kuadrant-system/authorino-79cbc94b89-dx9cb"
Apr 16 16:34:22.958007 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.957757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvhk\" (UniqueName: \"kubernetes.io/projected/0f19f5b1-6ec8-47ce-afa2-8b689345bedf-kube-api-access-rqvhk\") pod \"authorino-79cbc94b89-dx9cb\" (UID: \"0f19f5b1-6ec8-47ce-afa2-8b689345bedf\") " pod="kuadrant-system/authorino-79cbc94b89-dx9cb"
Apr 16 16:34:22.958430 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.958406 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-d4kjz"]
Apr 16 16:34:22.960442 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:34:22.960408 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08b4df05_dbc3_409b_8dec_5e7dbc4fbec3.slice/crio-daf74df709f6cb9b1ece63cd0f5dd5ad03e818ac9bd5d46064e70e982bd0a7ec WatchSource:0}: Error finding container daf74df709f6cb9b1ece63cd0f5dd5ad03e818ac9bd5d46064e70e982bd0a7ec: Status 404 returned error can't find the container with id daf74df709f6cb9b1ece63cd0f5dd5ad03e818ac9bd5d46064e70e982bd0a7ec
Apr 16 16:34:22.970078 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:22.970053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqvhk\" (UniqueName: \"kubernetes.io/projected/0f19f5b1-6ec8-47ce-afa2-8b689345bedf-kube-api-access-rqvhk\") pod \"authorino-79cbc94b89-dx9cb\" (UID: \"0f19f5b1-6ec8-47ce-afa2-8b689345bedf\") " pod="kuadrant-system/authorino-79cbc94b89-dx9cb"
Apr 16 16:34:23.083485 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:23.083395 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dx9cb"
Apr 16 16:34:23.196871 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:23.196836 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-d4kjz" event={"ID":"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3","Type":"ContainerStarted","Data":"daf74df709f6cb9b1ece63cd0f5dd5ad03e818ac9bd5d46064e70e982bd0a7ec"}
Apr 16 16:34:23.233530 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:23.233504 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dx9cb"]
Apr 16 16:34:23.235638 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:34:23.235609 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f19f5b1_6ec8_47ce_afa2_8b689345bedf.slice/crio-95eed9fd9a10b83c13b27e3a0103be7e0a9872c26cfbca44d5212353ccd9fb1b WatchSource:0}: Error finding container 95eed9fd9a10b83c13b27e3a0103be7e0a9872c26cfbca44d5212353ccd9fb1b: Status 404 returned error can't find the container with id 95eed9fd9a10b83c13b27e3a0103be7e0a9872c26cfbca44d5212353ccd9fb1b
Apr 16 16:34:24.203009 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:24.202968 2571
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" event={"ID":"0f19f5b1-6ec8-47ce-afa2-8b689345bedf","Type":"ContainerStarted","Data":"95eed9fd9a10b83c13b27e3a0103be7e0a9872c26cfbca44d5212353ccd9fb1b"} Apr 16 16:34:26.213503 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:26.213408 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" event={"ID":"0f19f5b1-6ec8-47ce-afa2-8b689345bedf","Type":"ContainerStarted","Data":"c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e"} Apr 16 16:34:26.214898 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:26.214873 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-d4kjz" event={"ID":"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3","Type":"ContainerStarted","Data":"11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530"} Apr 16 16:34:26.235278 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:26.235229 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" podStartSLOduration=1.52546577 podStartE2EDuration="4.235213828s" podCreationTimestamp="2026-04-16 16:34:22 +0000 UTC" firstStartedPulling="2026-04-16 16:34:23.237034115 +0000 UTC m=+667.071886998" lastFinishedPulling="2026-04-16 16:34:25.94678217 +0000 UTC m=+669.781635056" observedRunningTime="2026-04-16 16:34:26.23293614 +0000 UTC m=+670.067789089" watchObservedRunningTime="2026-04-16 16:34:26.235213828 +0000 UTC m=+670.070066775" Apr 16 16:34:26.255949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:26.255893 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-d4kjz" podStartSLOduration=1.282712323 podStartE2EDuration="4.255877594s" podCreationTimestamp="2026-04-16 16:34:22 +0000 UTC" firstStartedPulling="2026-04-16 16:34:22.961619478 +0000 UTC m=+666.796472359" 
lastFinishedPulling="2026-04-16 16:34:25.934784734 +0000 UTC m=+669.769637630" observedRunningTime="2026-04-16 16:34:26.255206112 +0000 UTC m=+670.090059017" watchObservedRunningTime="2026-04-16 16:34:26.255877594 +0000 UTC m=+670.090730515" Apr 16 16:34:26.286027 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:26.285991 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-d4kjz"] Apr 16 16:34:28.222927 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:28.222867 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-d4kjz" podUID="08b4df05-dbc3-409b-8dec-5e7dbc4fbec3" containerName="authorino" containerID="cri-o://11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530" gracePeriod=30 Apr 16 16:34:28.468504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:28.468481 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-d4kjz" Apr 16 16:34:28.498249 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:28.498153 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wljd\" (UniqueName: \"kubernetes.io/projected/08b4df05-dbc3-409b-8dec-5e7dbc4fbec3-kube-api-access-4wljd\") pod \"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3\" (UID: \"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3\") " Apr 16 16:34:28.500449 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:28.500421 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b4df05-dbc3-409b-8dec-5e7dbc4fbec3-kube-api-access-4wljd" (OuterVolumeSpecName: "kube-api-access-4wljd") pod "08b4df05-dbc3-409b-8dec-5e7dbc4fbec3" (UID: "08b4df05-dbc3-409b-8dec-5e7dbc4fbec3"). InnerVolumeSpecName "kube-api-access-4wljd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:34:28.599157 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:28.599095 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4wljd\" (UniqueName: \"kubernetes.io/projected/08b4df05-dbc3-409b-8dec-5e7dbc4fbec3-kube-api-access-4wljd\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:34:29.228021 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.227987 2571 generic.go:358] "Generic (PLEG): container finished" podID="08b4df05-dbc3-409b-8dec-5e7dbc4fbec3" containerID="11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530" exitCode=0 Apr 16 16:34:29.228477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.228038 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-d4kjz" Apr 16 16:34:29.228477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.228037 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-d4kjz" event={"ID":"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3","Type":"ContainerDied","Data":"11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530"} Apr 16 16:34:29.228477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.228082 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-d4kjz" event={"ID":"08b4df05-dbc3-409b-8dec-5e7dbc4fbec3","Type":"ContainerDied","Data":"daf74df709f6cb9b1ece63cd0f5dd5ad03e818ac9bd5d46064e70e982bd0a7ec"} Apr 16 16:34:29.228477 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.228102 2571 scope.go:117] "RemoveContainer" containerID="11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530" Apr 16 16:34:29.237229 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.237209 2571 scope.go:117] "RemoveContainer" containerID="11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530" Apr 16 16:34:29.237509 ip-10-0-138-125 kubenswrapper[2571]: 
E0416 16:34:29.237486 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530\": container with ID starting with 11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530 not found: ID does not exist" containerID="11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530" Apr 16 16:34:29.237569 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.237520 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530"} err="failed to get container status \"11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530\": rpc error: code = NotFound desc = could not find container \"11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530\": container with ID starting with 11521a133ef1167a8496d69646b3da97e5ca65c335d6e28bb17ed3eb6d9a6530 not found: ID does not exist" Apr 16 16:34:29.245212 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.245181 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-d4kjz"] Apr 16 16:34:29.248990 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:29.248967 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-d4kjz"] Apr 16 16:34:30.755654 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:30.755616 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b4df05-dbc3-409b-8dec-5e7dbc4fbec3" path="/var/lib/kubelet/pods/08b4df05-dbc3-409b-8dec-5e7dbc4fbec3/volumes" Apr 16 16:34:39.008805 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.008771 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dx9cb"] Apr 16 16:34:39.009248 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.008979 2571 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" podUID="0f19f5b1-6ec8-47ce-afa2-8b689345bedf" containerName="authorino" containerID="cri-o://c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e" gracePeriod=30 Apr 16 16:34:39.254038 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.254009 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" Apr 16 16:34:39.269304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.269202 2571 generic.go:358] "Generic (PLEG): container finished" podID="0f19f5b1-6ec8-47ce-afa2-8b689345bedf" containerID="c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e" exitCode=0 Apr 16 16:34:39.269304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.269241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" event={"ID":"0f19f5b1-6ec8-47ce-afa2-8b689345bedf","Type":"ContainerDied","Data":"c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e"} Apr 16 16:34:39.269304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.269266 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" Apr 16 16:34:39.269304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.269278 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-dx9cb" event={"ID":"0f19f5b1-6ec8-47ce-afa2-8b689345bedf","Type":"ContainerDied","Data":"95eed9fd9a10b83c13b27e3a0103be7e0a9872c26cfbca44d5212353ccd9fb1b"} Apr 16 16:34:39.269304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.269298 2571 scope.go:117] "RemoveContainer" containerID="c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e" Apr 16 16:34:39.279445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.279421 2571 scope.go:117] "RemoveContainer" containerID="c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e" Apr 16 16:34:39.279740 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:34:39.279721 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e\": container with ID starting with c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e not found: ID does not exist" containerID="c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e" Apr 16 16:34:39.279793 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.279751 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e"} err="failed to get container status \"c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e\": rpc error: code = NotFound desc = could not find container \"c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e\": container with ID starting with c354a2f66286bcb447b0a3ca0876e6323f52c8cd1b744a917e7e8483c34fb98e not found: ID does not exist" Apr 16 16:34:39.298735 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.298708 2571 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqvhk\" (UniqueName: \"kubernetes.io/projected/0f19f5b1-6ec8-47ce-afa2-8b689345bedf-kube-api-access-rqvhk\") pod \"0f19f5b1-6ec8-47ce-afa2-8b689345bedf\" (UID: \"0f19f5b1-6ec8-47ce-afa2-8b689345bedf\") " Apr 16 16:34:39.300664 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.300637 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f19f5b1-6ec8-47ce-afa2-8b689345bedf-kube-api-access-rqvhk" (OuterVolumeSpecName: "kube-api-access-rqvhk") pod "0f19f5b1-6ec8-47ce-afa2-8b689345bedf" (UID: "0f19f5b1-6ec8-47ce-afa2-8b689345bedf"). InnerVolumeSpecName "kube-api-access-rqvhk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:34:39.399956 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.399918 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqvhk\" (UniqueName: \"kubernetes.io/projected/0f19f5b1-6ec8-47ce-afa2-8b689345bedf-kube-api-access-rqvhk\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:34:39.590343 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.590297 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dx9cb"] Apr 16 16:34:39.593493 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:39.593462 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-dx9cb"] Apr 16 16:34:40.755601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:40.755568 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f19f5b1-6ec8-47ce-afa2-8b689345bedf" path="/var/lib/kubelet/pods/0f19f5b1-6ec8-47ce-afa2-8b689345bedf/volumes" Apr 16 16:34:58.238397 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.238358 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-dcccd46b9-qvj2p"] Apr 16 16:34:58.238915 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:34:58.238809 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f19f5b1-6ec8-47ce-afa2-8b689345bedf" containerName="authorino" Apr 16 16:34:58.238915 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.238826 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f19f5b1-6ec8-47ce-afa2-8b689345bedf" containerName="authorino" Apr 16 16:34:58.238915 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.238849 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08b4df05-dbc3-409b-8dec-5e7dbc4fbec3" containerName="authorino" Apr 16 16:34:58.238915 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.238857 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="08b4df05-dbc3-409b-8dec-5e7dbc4fbec3" containerName="authorino" Apr 16 16:34:58.239068 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.238942 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f19f5b1-6ec8-47ce-afa2-8b689345bedf" containerName="authorino" Apr 16 16:34:58.239068 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.238958 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="08b4df05-dbc3-409b-8dec-5e7dbc4fbec3" containerName="authorino" Apr 16 16:34:58.242936 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.242913 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:34:58.246387 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.246362 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 16:34:58.246577 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.246561 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:34:58.247343 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.247324 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-2mxhh\"" Apr 16 16:34:58.248505 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.248485 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:34:58.253105 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.253081 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-dcccd46b9-qvj2p"] Apr 16 16:34:58.372269 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.372224 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca0915d6-9e74-4495-a517-0eaf62c5ef18-cert\") pod \"llmisvc-controller-manager-dcccd46b9-qvj2p\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:34:58.372459 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.372364 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mjr\" (UniqueName: \"kubernetes.io/projected/ca0915d6-9e74-4495-a517-0eaf62c5ef18-kube-api-access-28mjr\") pod \"llmisvc-controller-manager-dcccd46b9-qvj2p\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 
16:34:58.473186 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.473141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca0915d6-9e74-4495-a517-0eaf62c5ef18-cert\") pod \"llmisvc-controller-manager-dcccd46b9-qvj2p\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:34:58.473363 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.473244 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28mjr\" (UniqueName: \"kubernetes.io/projected/ca0915d6-9e74-4495-a517-0eaf62c5ef18-kube-api-access-28mjr\") pod \"llmisvc-controller-manager-dcccd46b9-qvj2p\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:34:58.475684 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.475661 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca0915d6-9e74-4495-a517-0eaf62c5ef18-cert\") pod \"llmisvc-controller-manager-dcccd46b9-qvj2p\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:34:58.482684 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.482651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mjr\" (UniqueName: \"kubernetes.io/projected/ca0915d6-9e74-4495-a517-0eaf62c5ef18-kube-api-access-28mjr\") pod \"llmisvc-controller-manager-dcccd46b9-qvj2p\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:34:58.554692 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.554604 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:34:58.689749 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.689710 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-dcccd46b9-qvj2p"] Apr 16 16:34:58.694253 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:34:58.694218 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podca0915d6_9e74_4495_a517_0eaf62c5ef18.slice/crio-a22cd03f24c53fe16151c430473a1f36fa6738eec83c67a7db7182053b28488c WatchSource:0}: Error finding container a22cd03f24c53fe16151c430473a1f36fa6738eec83c67a7db7182053b28488c: Status 404 returned error can't find the container with id a22cd03f24c53fe16151c430473a1f36fa6738eec83c67a7db7182053b28488c Apr 16 16:34:58.695601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:58.695581 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:34:59.351177 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:34:59.351110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" event={"ID":"ca0915d6-9e74-4495-a517-0eaf62c5ef18","Type":"ContainerStarted","Data":"a22cd03f24c53fe16151c430473a1f36fa6738eec83c67a7db7182053b28488c"} Apr 16 16:35:03.369801 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:35:03.369764 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" event={"ID":"ca0915d6-9e74-4495-a517-0eaf62c5ef18","Type":"ContainerStarted","Data":"4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb"} Apr 16 16:35:03.370243 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:35:03.369888 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:35:03.406508 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:35:03.406452 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" podStartSLOduration=1.8158853179999999 podStartE2EDuration="5.406436538s" podCreationTimestamp="2026-04-16 16:34:58 +0000 UTC" firstStartedPulling="2026-04-16 16:34:58.695755367 +0000 UTC m=+702.530608250" lastFinishedPulling="2026-04-16 16:35:02.286306587 +0000 UTC m=+706.121159470" observedRunningTime="2026-04-16 16:35:03.401523795 +0000 UTC m=+707.236376702" watchObservedRunningTime="2026-04-16 16:35:03.406436538 +0000 UTC m=+707.241289443" Apr 16 16:35:34.376489 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:35:34.376459 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:36:09.459263 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.459230 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-q5xtr"] Apr 16 16:36:09.461926 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.461910 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.465643 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.465619 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 16:36:09.465787 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.465666 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-bmw94\"" Apr 16 16:36:09.496836 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.496794 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/073d2f21-2a54-4b10-b12b-4a5daaa15777-cert\") pod \"odh-model-controller-696fc77849-q5xtr\" (UID: \"073d2f21-2a54-4b10-b12b-4a5daaa15777\") " pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.497035 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.496850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpz7\" (UniqueName: \"kubernetes.io/projected/073d2f21-2a54-4b10-b12b-4a5daaa15777-kube-api-access-glpz7\") pod \"odh-model-controller-696fc77849-q5xtr\" (UID: \"073d2f21-2a54-4b10-b12b-4a5daaa15777\") " pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.509342 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.509309 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-q5xtr"] Apr 16 16:36:09.597554 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.597516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/073d2f21-2a54-4b10-b12b-4a5daaa15777-cert\") pod \"odh-model-controller-696fc77849-q5xtr\" (UID: \"073d2f21-2a54-4b10-b12b-4a5daaa15777\") " pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.597743 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.597567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glpz7\" (UniqueName: \"kubernetes.io/projected/073d2f21-2a54-4b10-b12b-4a5daaa15777-kube-api-access-glpz7\") pod \"odh-model-controller-696fc77849-q5xtr\" (UID: \"073d2f21-2a54-4b10-b12b-4a5daaa15777\") " pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.599984 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.599953 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/073d2f21-2a54-4b10-b12b-4a5daaa15777-cert\") pod \"odh-model-controller-696fc77849-q5xtr\" (UID: \"073d2f21-2a54-4b10-b12b-4a5daaa15777\") " pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.617169 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.617138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glpz7\" (UniqueName: \"kubernetes.io/projected/073d2f21-2a54-4b10-b12b-4a5daaa15777-kube-api-access-glpz7\") pod \"odh-model-controller-696fc77849-q5xtr\" (UID: \"073d2f21-2a54-4b10-b12b-4a5daaa15777\") " pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.772703 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.772595 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:09.920485 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:09.920435 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-q5xtr"] Apr 16 16:36:09.930944 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:36:09.930911 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod073d2f21_2a54_4b10_b12b_4a5daaa15777.slice/crio-89778eb075d18bab24d8b8dff879a6e7a2a39867f5eb8c7a450ab4ffeeae26ab WatchSource:0}: Error finding container 89778eb075d18bab24d8b8dff879a6e7a2a39867f5eb8c7a450ab4ffeeae26ab: Status 404 returned error can't find the container with id 89778eb075d18bab24d8b8dff879a6e7a2a39867f5eb8c7a450ab4ffeeae26ab Apr 16 16:36:10.632887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:10.632828 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-q5xtr" event={"ID":"073d2f21-2a54-4b10-b12b-4a5daaa15777","Type":"ContainerStarted","Data":"89778eb075d18bab24d8b8dff879a6e7a2a39867f5eb8c7a450ab4ffeeae26ab"} Apr 16 16:36:12.644187 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:12.644143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-q5xtr" event={"ID":"073d2f21-2a54-4b10-b12b-4a5daaa15777","Type":"ContainerStarted","Data":"3344b1d265761cb6152a6a5b2f0cc56edca9465e330b5c6b879762b083cb34a0"} Apr 16 16:36:12.644560 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:12.644263 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:12.696949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:12.696888 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-q5xtr" podStartSLOduration=1.118248894 podStartE2EDuration="3.6968712s" 
podCreationTimestamp="2026-04-16 16:36:09 +0000 UTC" firstStartedPulling="2026-04-16 16:36:09.932099462 +0000 UTC m=+773.766952345" lastFinishedPulling="2026-04-16 16:36:12.51072177 +0000 UTC m=+776.345574651" observedRunningTime="2026-04-16 16:36:12.695042275 +0000 UTC m=+776.529895178" watchObservedRunningTime="2026-04-16 16:36:12.6968712 +0000 UTC m=+776.531724104" Apr 16 16:36:23.651276 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:23.651247 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-q5xtr" Apr 16 16:36:24.551700 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.551665 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-84kpx"] Apr 16 16:36:24.554241 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.554222 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-84kpx" Apr 16 16:36:24.557220 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.557192 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-b7dsj\"" Apr 16 16:36:24.557416 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.557400 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:36:24.575771 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.575735 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-84kpx"] Apr 16 16:36:24.619211 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.619180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkrfq\" (UniqueName: \"kubernetes.io/projected/e9cf7092-889c-45ab-8613-967b93b85c04-kube-api-access-bkrfq\") pod \"s3-init-84kpx\" (UID: \"e9cf7092-889c-45ab-8613-967b93b85c04\") " pod="kserve/s3-init-84kpx" Apr 16 16:36:24.719751 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.719707 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkrfq\" (UniqueName: \"kubernetes.io/projected/e9cf7092-889c-45ab-8613-967b93b85c04-kube-api-access-bkrfq\") pod \"s3-init-84kpx\" (UID: \"e9cf7092-889c-45ab-8613-967b93b85c04\") " pod="kserve/s3-init-84kpx" Apr 16 16:36:24.733528 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.733495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkrfq\" (UniqueName: \"kubernetes.io/projected/e9cf7092-889c-45ab-8613-967b93b85c04-kube-api-access-bkrfq\") pod \"s3-init-84kpx\" (UID: \"e9cf7092-889c-45ab-8613-967b93b85c04\") " pod="kserve/s3-init-84kpx" Apr 16 16:36:24.863677 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.863590 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-84kpx" Apr 16 16:36:24.999385 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:24.999360 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-84kpx"] Apr 16 16:36:25.001733 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:36:25.001699 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9cf7092_889c_45ab_8613_967b93b85c04.slice/crio-66115bea2916d94bec8280db0f8f068fd5fe974dc0bc52db61bb80bf40b2ce6f WatchSource:0}: Error finding container 66115bea2916d94bec8280db0f8f068fd5fe974dc0bc52db61bb80bf40b2ce6f: Status 404 returned error can't find the container with id 66115bea2916d94bec8280db0f8f068fd5fe974dc0bc52db61bb80bf40b2ce6f Apr 16 16:36:25.701388 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:25.701348 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84kpx" event={"ID":"e9cf7092-889c-45ab-8613-967b93b85c04","Type":"ContainerStarted","Data":"66115bea2916d94bec8280db0f8f068fd5fe974dc0bc52db61bb80bf40b2ce6f"} Apr 16 16:36:29.721438 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:29.721397 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84kpx" event={"ID":"e9cf7092-889c-45ab-8613-967b93b85c04","Type":"ContainerStarted","Data":"281262f8d033b72bbec3428cbcf8b1a6b12fe27e1c8996b739bb1ac5be1c5e11"} Apr 16 16:36:29.754722 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:29.754667 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-84kpx" podStartSLOduration=1.403416115 podStartE2EDuration="5.75465172s" podCreationTimestamp="2026-04-16 16:36:24 +0000 UTC" firstStartedPulling="2026-04-16 16:36:25.003683985 +0000 UTC m=+788.838536878" lastFinishedPulling="2026-04-16 16:36:29.35491959 +0000 UTC m=+793.189772483" observedRunningTime="2026-04-16 16:36:29.752410147 +0000 UTC m=+793.587263051" watchObservedRunningTime="2026-04-16 16:36:29.75465172 +0000 UTC m=+793.589504623" Apr 16 16:36:32.735429 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:32.735394 2571 generic.go:358] "Generic (PLEG): container finished" podID="e9cf7092-889c-45ab-8613-967b93b85c04" containerID="281262f8d033b72bbec3428cbcf8b1a6b12fe27e1c8996b739bb1ac5be1c5e11" exitCode=0 Apr 16 16:36:32.735803 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:32.735467 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84kpx" event={"ID":"e9cf7092-889c-45ab-8613-967b93b85c04","Type":"ContainerDied","Data":"281262f8d033b72bbec3428cbcf8b1a6b12fe27e1c8996b739bb1ac5be1c5e11"} Apr 16 16:36:33.878490 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:33.878460 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-84kpx" Apr 16 16:36:34.011956 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:34.011868 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkrfq\" (UniqueName: \"kubernetes.io/projected/e9cf7092-889c-45ab-8613-967b93b85c04-kube-api-access-bkrfq\") pod \"e9cf7092-889c-45ab-8613-967b93b85c04\" (UID: \"e9cf7092-889c-45ab-8613-967b93b85c04\") " Apr 16 16:36:34.014104 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:34.014078 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9cf7092-889c-45ab-8613-967b93b85c04-kube-api-access-bkrfq" (OuterVolumeSpecName: "kube-api-access-bkrfq") pod "e9cf7092-889c-45ab-8613-967b93b85c04" (UID: "e9cf7092-889c-45ab-8613-967b93b85c04"). InnerVolumeSpecName "kube-api-access-bkrfq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:36:34.112729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:34.112690 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkrfq\" (UniqueName: \"kubernetes.io/projected/e9cf7092-889c-45ab-8613-967b93b85c04-kube-api-access-bkrfq\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:36:34.744691 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:34.744662 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-84kpx" Apr 16 16:36:34.744691 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:34.744676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-84kpx" event={"ID":"e9cf7092-889c-45ab-8613-967b93b85c04","Type":"ContainerDied","Data":"66115bea2916d94bec8280db0f8f068fd5fe974dc0bc52db61bb80bf40b2ce6f"} Apr 16 16:36:34.744915 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:36:34.744705 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66115bea2916d94bec8280db0f8f068fd5fe974dc0bc52db61bb80bf40b2ce6f" Apr 16 16:37:01.084455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.084414 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk"] Apr 16 16:37:01.085026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.084997 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9cf7092-889c-45ab-8613-967b93b85c04" containerName="s3-init" Apr 16 16:37:01.085149 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.085029 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9cf7092-889c-45ab-8613-967b93b85c04" containerName="s3-init" Apr 16 16:37:01.085149 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.085133 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9cf7092-889c-45ab-8613-967b93b85c04" containerName="s3-init" Apr 16 16:37:01.101552 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.101476 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk"] Apr 16 16:37:01.101742 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.101654 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.106040 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.106016 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4ddlh\"" Apr 16 16:37:01.106210 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.106138 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:37:01.106626 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.106606 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 16:37:01.106735 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.106716 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:37:01.169373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.169332 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a788cc-188a-4dc0-a974-25dd1228bda1-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.169573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.169380 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.169573 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:37:01.169472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-dshm\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.169573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.169528 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-home\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.169713 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.169590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-model-cache\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.169713 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.169632 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w79vx\" (UniqueName: \"kubernetes.io/projected/89a788cc-188a-4dc0-a974-25dd1228bda1-kube-api-access-w79vx\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.270513 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270469 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-home\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.270718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-model-cache\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.270718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270585 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w79vx\" (UniqueName: \"kubernetes.io/projected/89a788cc-188a-4dc0-a974-25dd1228bda1-kube-api-access-w79vx\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.270718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a788cc-188a-4dc0-a974-25dd1228bda1-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.270718 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270711 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.270939 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-dshm\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.270997 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-home\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.271054 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.270989 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-model-cache\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.271183 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.271159 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: 
\"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.273196 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.273174 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-dshm\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.273376 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.273360 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a788cc-188a-4dc0-a974-25dd1228bda1-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.280554 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.280523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w79vx\" (UniqueName: \"kubernetes.io/projected/89a788cc-188a-4dc0-a974-25dd1228bda1-kube-api-access-w79vx\") pod \"scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.413792 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.413750 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:01.560417 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.560383 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk"] Apr 16 16:37:01.562607 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:37:01.562579 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89a788cc_188a_4dc0_a974_25dd1228bda1.slice/crio-92b7653fcb9e9960141e143bb9744c57aca6206d029b14983b3494292fc7e6f6 WatchSource:0}: Error finding container 92b7653fcb9e9960141e143bb9744c57aca6206d029b14983b3494292fc7e6f6: Status 404 returned error can't find the container with id 92b7653fcb9e9960141e143bb9744c57aca6206d029b14983b3494292fc7e6f6 Apr 16 16:37:01.853256 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:01.853162 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" event={"ID":"89a788cc-188a-4dc0-a974-25dd1228bda1","Type":"ContainerStarted","Data":"92b7653fcb9e9960141e143bb9744c57aca6206d029b14983b3494292fc7e6f6"} Apr 16 16:37:05.872160 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:05.872106 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" event={"ID":"89a788cc-188a-4dc0-a974-25dd1228bda1","Type":"ContainerStarted","Data":"016c9e3aec8949b68b6ae10a40fb016ab3aea430dc01fc7b19625ae7a79456ad"} Apr 16 16:37:09.895592 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:09.895509 2571 generic.go:358] "Generic (PLEG): container finished" podID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerID="016c9e3aec8949b68b6ae10a40fb016ab3aea430dc01fc7b19625ae7a79456ad" exitCode=0 Apr 16 16:37:09.895592 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:09.895579 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" event={"ID":"89a788cc-188a-4dc0-a974-25dd1228bda1","Type":"ContainerDied","Data":"016c9e3aec8949b68b6ae10a40fb016ab3aea430dc01fc7b19625ae7a79456ad"} Apr 16 16:37:12.910852 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:12.910814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" event={"ID":"89a788cc-188a-4dc0-a974-25dd1228bda1","Type":"ContainerStarted","Data":"ea40c1b717eb2863741529191013de0d3cf0e295468835d668c3422852fbae5c"} Apr 16 16:37:12.934063 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:12.934003 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" podStartSLOduration=1.002142903 podStartE2EDuration="11.933981293s" podCreationTimestamp="2026-04-16 16:37:01 +0000 UTC" firstStartedPulling="2026-04-16 16:37:01.564614702 +0000 UTC m=+825.399467583" lastFinishedPulling="2026-04-16 16:37:12.496453079 +0000 UTC m=+836.331305973" observedRunningTime="2026-04-16 16:37:12.933188854 +0000 UTC m=+836.768041758" watchObservedRunningTime="2026-04-16 16:37:12.933981293 +0000 UTC m=+836.768834200" Apr 16 16:37:21.414839 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:21.414785 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:21.415410 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:21.414856 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:21.427969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:21.427929 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 
16 16:37:21.958660 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:21.958632 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:44.823920 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:44.823880 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk"] Apr 16 16:37:44.824462 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:44.824278 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" podUID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerName="main" containerID="cri-o://ea40c1b717eb2863741529191013de0d3cf0e295468835d668c3422852fbae5c" gracePeriod=30 Apr 16 16:37:45.036706 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.036669 2571 generic.go:358] "Generic (PLEG): container finished" podID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerID="ea40c1b717eb2863741529191013de0d3cf0e295468835d668c3422852fbae5c" exitCode=0 Apr 16 16:37:45.036900 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.036737 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" event={"ID":"89a788cc-188a-4dc0-a974-25dd1228bda1","Type":"ContainerDied","Data":"ea40c1b717eb2863741529191013de0d3cf0e295468835d668c3422852fbae5c"} Apr 16 16:37:45.076061 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.075993 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:45.084785 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.084762 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-dshm\") pod \"89a788cc-188a-4dc0-a974-25dd1228bda1\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " Apr 16 16:37:45.084872 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.084814 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w79vx\" (UniqueName: \"kubernetes.io/projected/89a788cc-188a-4dc0-a974-25dd1228bda1-kube-api-access-w79vx\") pod \"89a788cc-188a-4dc0-a974-25dd1228bda1\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " Apr 16 16:37:45.084872 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.084834 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-model-cache\") pod \"89a788cc-188a-4dc0-a974-25dd1228bda1\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " Apr 16 16:37:45.084872 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.084858 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-home\") pod \"89a788cc-188a-4dc0-a974-25dd1228bda1\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " Apr 16 16:37:45.085031 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.084876 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a788cc-188a-4dc0-a974-25dd1228bda1-tls-certs\") pod \"89a788cc-188a-4dc0-a974-25dd1228bda1\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " Apr 16 16:37:45.085031 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:37:45.084898 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-kserve-provision-location\") pod \"89a788cc-188a-4dc0-a974-25dd1228bda1\" (UID: \"89a788cc-188a-4dc0-a974-25dd1228bda1\") " Apr 16 16:37:45.085229 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.085165 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-home" (OuterVolumeSpecName: "home") pod "89a788cc-188a-4dc0-a974-25dd1228bda1" (UID: "89a788cc-188a-4dc0-a974-25dd1228bda1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:37:45.085229 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.085180 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-model-cache" (OuterVolumeSpecName: "model-cache") pod "89a788cc-188a-4dc0-a974-25dd1228bda1" (UID: "89a788cc-188a-4dc0-a974-25dd1228bda1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:37:45.087442 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.087401 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89a788cc-188a-4dc0-a974-25dd1228bda1-kube-api-access-w79vx" (OuterVolumeSpecName: "kube-api-access-w79vx") pod "89a788cc-188a-4dc0-a974-25dd1228bda1" (UID: "89a788cc-188a-4dc0-a974-25dd1228bda1"). InnerVolumeSpecName "kube-api-access-w79vx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:37:45.087579 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.087469 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-dshm" (OuterVolumeSpecName: "dshm") pod "89a788cc-188a-4dc0-a974-25dd1228bda1" (UID: "89a788cc-188a-4dc0-a974-25dd1228bda1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:37:45.087579 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.087481 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89a788cc-188a-4dc0-a974-25dd1228bda1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "89a788cc-188a-4dc0-a974-25dd1228bda1" (UID: "89a788cc-188a-4dc0-a974-25dd1228bda1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:37:45.144996 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.144941 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "89a788cc-188a-4dc0-a974-25dd1228bda1" (UID: "89a788cc-188a-4dc0-a974-25dd1228bda1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:37:45.186103 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.186062 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-dshm\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:37:45.186103 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.186097 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w79vx\" (UniqueName: \"kubernetes.io/projected/89a788cc-188a-4dc0-a974-25dd1228bda1-kube-api-access-w79vx\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:37:45.186103 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.186110 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-model-cache\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:37:45.186365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.186143 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-home\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:37:45.186365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.186152 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/89a788cc-188a-4dc0-a974-25dd1228bda1-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:37:45.186365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:45.186160 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89a788cc-188a-4dc0-a974-25dd1228bda1-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:37:46.042367 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:46.042335 2571 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" Apr 16 16:37:46.042807 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:46.042352 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk" event={"ID":"89a788cc-188a-4dc0-a974-25dd1228bda1","Type":"ContainerDied","Data":"92b7653fcb9e9960141e143bb9744c57aca6206d029b14983b3494292fc7e6f6"} Apr 16 16:37:46.042807 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:46.042410 2571 scope.go:117] "RemoveContainer" containerID="ea40c1b717eb2863741529191013de0d3cf0e295468835d668c3422852fbae5c" Apr 16 16:37:46.052256 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:46.052235 2571 scope.go:117] "RemoveContainer" containerID="016c9e3aec8949b68b6ae10a40fb016ab3aea430dc01fc7b19625ae7a79456ad" Apr 16 16:37:46.067935 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:46.067898 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk"] Apr 16 16:37:46.071334 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:46.071309 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-f6cb4c9b9-thlbk"] Apr 16 16:37:46.755864 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:46.755832 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89a788cc-188a-4dc0-a974-25dd1228bda1" path="/var/lib/kubelet/pods/89a788cc-188a-4dc0-a974-25dd1228bda1/volumes" Apr 16 16:37:49.834426 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.834390 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2"] Apr 16 16:37:49.835026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.835005 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerName="main" Apr 16 16:37:49.835026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.835028 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerName="main" Apr 16 16:37:49.835194 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.835049 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerName="storage-initializer" Apr 16 16:37:49.835194 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.835058 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerName="storage-initializer" Apr 16 16:37:49.835194 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.835182 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="89a788cc-188a-4dc0-a974-25dd1228bda1" containerName="main" Apr 16 16:37:49.840316 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.840293 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:49.843949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.843930 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4ddlh\"" Apr 16 16:37:49.844058 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.843934 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:37:49.844855 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.844833 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 16:37:49.844983 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.844865 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:37:49.854821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.854780 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2"] Apr 16 16:37:49.932819 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.932783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-dshm\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:49.932980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.932829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-home\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: 
\"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:49.932980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.932872 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-model-cache\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:49.932980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.932889 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfv7\" (UniqueName: \"kubernetes.io/projected/53d6b444-fcd9-4396-a25b-210015f70028-kube-api-access-4rfv7\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:49.932980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.932909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d6b444-fcd9-4396-a25b-210015f70028-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:49.932980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:49.932928 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034189 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034148 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-model-cache\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034393 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034196 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfv7\" (UniqueName: \"kubernetes.io/projected/53d6b444-fcd9-4396-a25b-210015f70028-kube-api-access-4rfv7\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034393 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034219 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d6b444-fcd9-4396-a25b-210015f70028-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034393 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034393 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:37:50.034296 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-dshm\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034393 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034327 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-home\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034668 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-model-cache\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034668 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-home\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.034767 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.034685 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-kserve-provision-location\") pod 
\"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.036670 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.036636 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-dshm\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.036847 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.036827 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d6b444-fcd9-4396-a25b-210015f70028-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.045173 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.045143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfv7\" (UniqueName: \"kubernetes.io/projected/53d6b444-fcd9-4396-a25b-210015f70028-kube-api-access-4rfv7\") pod \"scheduler-ha-replicas-test-kserve-c467c496f-t4nb2\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.143041 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.142962 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4"] Apr 16 16:37:50.146429 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.146404 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.151444 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.151417 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:37:50.152529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.152511 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-8zwgx\"" Apr 16 16:37:50.169832 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.169800 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4"] Apr 16 16:37:50.235490 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.235456 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.235654 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.235504 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.235654 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.235621 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gsrp\" (UniqueName: \"kubernetes.io/projected/ef4c0643-710f-40f7-b5a3-59977c385ef9-kube-api-access-7gsrp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.235792 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.235683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4c0643-710f-40f7-b5a3-59977c385ef9-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.292398 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.292373 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2"] Apr 16 16:37:50.294838 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:37:50.294811 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d6b444_fcd9_4396_a25b_210015f70028.slice/crio-e94332415e79791f3614d94fe8baa4f495d3958336e5a353e4e694e1c0234a93 WatchSource:0}: Error finding container e94332415e79791f3614d94fe8baa4f495d3958336e5a353e4e694e1c0234a93: Status 404 returned error can't find the container with id e94332415e79791f3614d94fe8baa4f495d3958336e5a353e4e694e1c0234a93 Apr 16 16:37:50.337012 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.336969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-kserve-provision-location\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.337195 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.337024 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.337195 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.337094 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gsrp\" (UniqueName: \"kubernetes.io/projected/ef4c0643-710f-40f7-b5a3-59977c385ef9-kube-api-access-7gsrp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.337195 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.337164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4c0643-710f-40f7-b5a3-59977c385ef9-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.337372 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.337336 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-kserve-provision-location\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.337437 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.337375 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.339550 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.339528 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4c0643-710f-40f7-b5a3-59977c385ef9-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.346793 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.346772 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gsrp\" (UniqueName: \"kubernetes.io/projected/ef4c0643-710f-40f7-b5a3-59977c385ef9-kube-api-access-7gsrp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.456510 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.456468 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:37:50.602559 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:50.602526 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4"] Apr 16 16:37:50.603932 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:37:50.603900 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef4c0643_710f_40f7_b5a3_59977c385ef9.slice/crio-03824a0a72dda9020c0b81c741c286eabbbd75fa94e03750539456a4fb109d71 WatchSource:0}: Error finding container 03824a0a72dda9020c0b81c741c286eabbbd75fa94e03750539456a4fb109d71: Status 404 returned error can't find the container with id 03824a0a72dda9020c0b81c741c286eabbbd75fa94e03750539456a4fb109d71 Apr 16 16:37:51.063572 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:51.063535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerStarted","Data":"4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca"} Apr 16 16:37:51.063572 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:51.063575 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerStarted","Data":"03824a0a72dda9020c0b81c741c286eabbbd75fa94e03750539456a4fb109d71"} Apr 16 16:37:51.065241 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:51.065202 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" 
event={"ID":"53d6b444-fcd9-4396-a25b-210015f70028","Type":"ContainerStarted","Data":"1dd2d152ccfa8d11e63e8245d25299ba6f94cf9a13f3640966c625c3acefd2bc"} Apr 16 16:37:51.065370 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:51.065242 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" event={"ID":"53d6b444-fcd9-4396-a25b-210015f70028","Type":"ContainerStarted","Data":"e94332415e79791f3614d94fe8baa4f495d3958336e5a353e4e694e1c0234a93"} Apr 16 16:37:52.070566 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:52.070529 2571 generic.go:358] "Generic (PLEG): container finished" podID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerID="4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca" exitCode=0 Apr 16 16:37:52.071009 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:52.070605 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerDied","Data":"4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca"} Apr 16 16:37:55.091031 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:55.090986 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerStarted","Data":"b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d"} Apr 16 16:37:55.093058 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:55.093026 2571 generic.go:358] "Generic (PLEG): container finished" podID="53d6b444-fcd9-4396-a25b-210015f70028" containerID="1dd2d152ccfa8d11e63e8245d25299ba6f94cf9a13f3640966c625c3acefd2bc" exitCode=0 Apr 16 16:37:55.093257 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:55.093230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" event={"ID":"53d6b444-fcd9-4396-a25b-210015f70028","Type":"ContainerDied","Data":"1dd2d152ccfa8d11e63e8245d25299ba6f94cf9a13f3640966c625c3acefd2bc"} Apr 16 16:37:56.103239 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:56.103200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" event={"ID":"53d6b444-fcd9-4396-a25b-210015f70028","Type":"ContainerStarted","Data":"04cf08b5d89e1c223ef6582786250892a88cda879a1aff1fbdb327261a8c9b8f"} Apr 16 16:37:56.127740 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:37:56.127662 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" podStartSLOduration=7.127641115 podStartE2EDuration="7.127641115s" podCreationTimestamp="2026-04-16 16:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:37:56.123907808 +0000 UTC m=+879.958760706" watchObservedRunningTime="2026-04-16 16:37:56.127641115 +0000 UTC m=+879.962494022" Apr 16 16:38:00.127392 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.127290 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerStarted","Data":"4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0"} Apr 16 16:38:00.127779 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.127402 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:38:00.152293 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.152240 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" podStartSLOduration=2.474537777 podStartE2EDuration="10.152224013s" podCreationTimestamp="2026-04-16 16:37:50 +0000 UTC" firstStartedPulling="2026-04-16 16:37:52.071898101 +0000 UTC m=+875.906750982" lastFinishedPulling="2026-04-16 16:37:59.749584334 +0000 UTC m=+883.584437218" observedRunningTime="2026-04-16 16:38:00.149857421 +0000 UTC m=+883.984710318" watchObservedRunningTime="2026-04-16 16:38:00.152224013 +0000 UTC m=+883.987076918" Apr 16 16:38:00.152501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.152413 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:38:00.152501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.152453 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:38:00.165014 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.164980 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:38:00.457143 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.457084 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:38:00.457143 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:00.457137 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:38:01.143307 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:01.143275 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:38:10.458815 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:38:10.458732 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:38:10.460056 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:10.460037 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:38:16.727622 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:16.727590 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:38:16.728193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:16.727848 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:38:16.731372 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:16.731351 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:38:16.731637 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:16.731611 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:38:26.019287 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.019243 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2"] Apr 16 16:38:26.019895 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.019541 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" podUID="53d6b444-fcd9-4396-a25b-210015f70028" containerName="main" 
containerID="cri-o://04cf08b5d89e1c223ef6582786250892a88cda879a1aff1fbdb327261a8c9b8f" gracePeriod=30 Apr 16 16:38:26.021852 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.021824 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4"] Apr 16 16:38:26.022183 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.022154 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="main" containerID="cri-o://b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d" gracePeriod=30 Apr 16 16:38:26.022433 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.022220 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="tokenizer" containerID="cri-o://4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0" gracePeriod=30 Apr 16 16:38:26.024272 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.024244 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 16:38:26.232237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.232200 2571 generic.go:358] "Generic (PLEG): container finished" podID="53d6b444-fcd9-4396-a25b-210015f70028" containerID="04cf08b5d89e1c223ef6582786250892a88cda879a1aff1fbdb327261a8c9b8f" exitCode=0 Apr 16 16:38:26.232430 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.232239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" event={"ID":"53d6b444-fcd9-4396-a25b-210015f70028","Type":"ContainerDied","Data":"04cf08b5d89e1c223ef6582786250892a88cda879a1aff1fbdb327261a8c9b8f"} Apr 16 16:38:26.234498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.234469 2571 generic.go:358] "Generic (PLEG): container finished" podID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerID="b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d" exitCode=0 Apr 16 16:38:26.234641 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.234528 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerDied","Data":"b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d"} Apr 16 16:38:26.284456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.284382 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:38:26.377523 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.377500 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:38:26.385033 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385000 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d6b444-fcd9-4396-a25b-210015f70028-tls-certs\") pod \"53d6b444-fcd9-4396-a25b-210015f70028\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " Apr 16 16:38:26.385199 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385043 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-tokenizer-uds\") pod \"ef4c0643-710f-40f7-b5a3-59977c385ef9\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " Apr 16 16:38:26.385199 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385126 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rfv7\" (UniqueName: \"kubernetes.io/projected/53d6b444-fcd9-4396-a25b-210015f70028-kube-api-access-4rfv7\") pod \"53d6b444-fcd9-4396-a25b-210015f70028\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " Apr 16 16:38:26.385199 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385159 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-kserve-provision-location\") pod \"ef4c0643-710f-40f7-b5a3-59977c385ef9\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " Apr 16 16:38:26.385199 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385174 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gsrp\" (UniqueName: \"kubernetes.io/projected/ef4c0643-710f-40f7-b5a3-59977c385ef9-kube-api-access-7gsrp\") pod \"ef4c0643-710f-40f7-b5a3-59977c385ef9\" (UID: 
\"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " Apr 16 16:38:26.385421 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385209 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-home\") pod \"53d6b444-fcd9-4396-a25b-210015f70028\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " Apr 16 16:38:26.385421 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385291 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-model-cache\") pod \"53d6b444-fcd9-4396-a25b-210015f70028\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " Apr 16 16:38:26.385421 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385359 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4c0643-710f-40f7-b5a3-59977c385ef9-tls-certs\") pod \"ef4c0643-710f-40f7-b5a3-59977c385ef9\" (UID: \"ef4c0643-710f-40f7-b5a3-59977c385ef9\") " Apr 16 16:38:26.385421 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385398 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-dshm\") pod \"53d6b444-fcd9-4396-a25b-210015f70028\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " Apr 16 16:38:26.385638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385431 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-kserve-provision-location\") pod \"53d6b444-fcd9-4396-a25b-210015f70028\" (UID: \"53d6b444-fcd9-4396-a25b-210015f70028\") " Apr 16 16:38:26.385638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385356 2571 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ef4c0643-710f-40f7-b5a3-59977c385ef9" (UID: "ef4c0643-710f-40f7-b5a3-59977c385ef9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:38:26.385638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385451 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-home" (OuterVolumeSpecName: "home") pod "53d6b444-fcd9-4396-a25b-210015f70028" (UID: "53d6b444-fcd9-4396-a25b-210015f70028"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:38:26.385638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385550 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-model-cache" (OuterVolumeSpecName: "model-cache") pod "53d6b444-fcd9-4396-a25b-210015f70028" (UID: "53d6b444-fcd9-4396-a25b-210015f70028"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:38:26.385842 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385827 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-home\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.385898 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385846 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-model-cache\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.385898 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385862 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-tokenizer-uds\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.386003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.385976 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ef4c0643-710f-40f7-b5a3-59977c385ef9" (UID: "ef4c0643-710f-40f7-b5a3-59977c385ef9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:38:26.387699 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.387639 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d6b444-fcd9-4396-a25b-210015f70028-kube-api-access-4rfv7" (OuterVolumeSpecName: "kube-api-access-4rfv7") pod "53d6b444-fcd9-4396-a25b-210015f70028" (UID: "53d6b444-fcd9-4396-a25b-210015f70028"). InnerVolumeSpecName "kube-api-access-4rfv7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:38:26.387824 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.387694 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef4c0643-710f-40f7-b5a3-59977c385ef9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ef4c0643-710f-40f7-b5a3-59977c385ef9" (UID: "ef4c0643-710f-40f7-b5a3-59977c385ef9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:38:26.387824 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.387747 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef4c0643-710f-40f7-b5a3-59977c385ef9-kube-api-access-7gsrp" (OuterVolumeSpecName: "kube-api-access-7gsrp") pod "ef4c0643-710f-40f7-b5a3-59977c385ef9" (UID: "ef4c0643-710f-40f7-b5a3-59977c385ef9"). InnerVolumeSpecName "kube-api-access-7gsrp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:38:26.387824 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.387760 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53d6b444-fcd9-4396-a25b-210015f70028-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "53d6b444-fcd9-4396-a25b-210015f70028" (UID: "53d6b444-fcd9-4396-a25b-210015f70028"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:38:26.388094 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.388079 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-dshm" (OuterVolumeSpecName: "dshm") pod "53d6b444-fcd9-4396-a25b-210015f70028" (UID: "53d6b444-fcd9-4396-a25b-210015f70028"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:38:26.445484 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.445422 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "53d6b444-fcd9-4396-a25b-210015f70028" (UID: "53d6b444-fcd9-4396-a25b-210015f70028"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:38:26.487322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.487276 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ef4c0643-710f-40f7-b5a3-59977c385ef9-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.487322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.487313 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-dshm\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.487322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.487327 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/53d6b444-fcd9-4396-a25b-210015f70028-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.487555 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.487340 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/53d6b444-fcd9-4396-a25b-210015f70028-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.487555 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.487352 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4rfv7\" (UniqueName: 
\"kubernetes.io/projected/53d6b444-fcd9-4396-a25b-210015f70028-kube-api-access-4rfv7\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.487555 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.487365 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef4c0643-710f-40f7-b5a3-59977c385ef9-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:26.487555 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:26.487377 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gsrp\" (UniqueName: \"kubernetes.io/projected/ef4c0643-710f-40f7-b5a3-59977c385ef9-kube-api-access-7gsrp\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:38:27.240129 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.240090 2571 generic.go:358] "Generic (PLEG): container finished" podID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerID="4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0" exitCode=0 Apr 16 16:38:27.240584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.240187 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" Apr 16 16:38:27.240584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.240184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerDied","Data":"4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0"} Apr 16 16:38:27.240584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.240289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4" event={"ID":"ef4c0643-710f-40f7-b5a3-59977c385ef9","Type":"ContainerDied","Data":"03824a0a72dda9020c0b81c741c286eabbbd75fa94e03750539456a4fb109d71"} Apr 16 16:38:27.240584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.240311 2571 scope.go:117] "RemoveContainer" containerID="4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0" Apr 16 16:38:27.241932 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.241907 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" event={"ID":"53d6b444-fcd9-4396-a25b-210015f70028","Type":"ContainerDied","Data":"e94332415e79791f3614d94fe8baa4f495d3958336e5a353e4e694e1c0234a93"} Apr 16 16:38:27.242035 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.241933 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2" Apr 16 16:38:27.249693 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.249674 2571 scope.go:117] "RemoveContainer" containerID="b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d" Apr 16 16:38:27.258162 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.258137 2571 scope.go:117] "RemoveContainer" containerID="4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca" Apr 16 16:38:27.260977 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.260954 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4"] Apr 16 16:38:27.265065 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.265039 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-84cf845rzwp4"] Apr 16 16:38:27.267373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.267356 2571 scope.go:117] "RemoveContainer" containerID="4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0" Apr 16 16:38:27.267649 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:38:27.267631 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0\": container with ID starting with 4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0 not found: ID does not exist" containerID="4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0" Apr 16 16:38:27.267717 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.267655 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0"} err="failed to get container status \"4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0\": rpc error: 
code = NotFound desc = could not find container \"4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0\": container with ID starting with 4d057fa498248c61340a5d1f0c3a7a7c101437ed671ed7ae44674d6c4b39a1f0 not found: ID does not exist" Apr 16 16:38:27.267717 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.267673 2571 scope.go:117] "RemoveContainer" containerID="b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d" Apr 16 16:38:27.267890 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:38:27.267875 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d\": container with ID starting with b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d not found: ID does not exist" containerID="b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d" Apr 16 16:38:27.267927 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.267896 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d"} err="failed to get container status \"b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d\": rpc error: code = NotFound desc = could not find container \"b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d\": container with ID starting with b69efcc2e0aeddac7e7144af066b94655a9e6b32bcb4dbefa5e76ef9b59fff9d not found: ID does not exist" Apr 16 16:38:27.267927 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.267910 2571 scope.go:117] "RemoveContainer" containerID="4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca" Apr 16 16:38:27.268148 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:38:27.268131 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca\": container with ID starting with 4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca not found: ID does not exist" containerID="4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca" Apr 16 16:38:27.268188 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.268152 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca"} err="failed to get container status \"4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca\": rpc error: code = NotFound desc = could not find container \"4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca\": container with ID starting with 4ad5325e29d0659479bd39193764c5d5116cb879046eb34fe3bb87eb18732aca not found: ID does not exist" Apr 16 16:38:27.268188 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.268166 2571 scope.go:117] "RemoveContainer" containerID="04cf08b5d89e1c223ef6582786250892a88cda879a1aff1fbdb327261a8c9b8f" Apr 16 16:38:27.275474 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.275448 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2"] Apr 16 16:38:27.280101 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.279617 2571 scope.go:117] "RemoveContainer" containerID="1dd2d152ccfa8d11e63e8245d25299ba6f94cf9a13f3640966c625c3acefd2bc" Apr 16 16:38:27.283308 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:27.283283 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-c467c496f-t4nb2"] Apr 16 16:38:28.756362 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:28.756318 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d6b444-fcd9-4396-a25b-210015f70028" path="/var/lib/kubelet/pods/53d6b444-fcd9-4396-a25b-210015f70028/volumes" Apr 16 
16:38:28.756957 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:28.756936 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" path="/var/lib/kubelet/pods/ef4c0643-710f-40f7-b5a3-59977c385ef9/volumes" Apr 16 16:38:37.344820 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.344777 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs"] Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345204 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53d6b444-fcd9-4396-a25b-210015f70028" containerName="main" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345219 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d6b444-fcd9-4396-a25b-210015f70028" containerName="main" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345234 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="storage-initializer" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345243 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="storage-initializer" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345255 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="tokenizer" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345261 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="tokenizer" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345269 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" 
containerName="main" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345274 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="main" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345284 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53d6b444-fcd9-4396-a25b-210015f70028" containerName="storage-initializer" Apr 16 16:38:37.345323 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345289 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d6b444-fcd9-4396-a25b-210015f70028" containerName="storage-initializer" Apr 16 16:38:37.345642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345354 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="tokenizer" Apr 16 16:38:37.345642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345364 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="53d6b444-fcd9-4396-a25b-210015f70028" containerName="main" Apr 16 16:38:37.345642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.345373 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef4c0643-710f-40f7-b5a3-59977c385ef9" containerName="main" Apr 16 16:38:37.350391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.350365 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.352755 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.352734 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:38:37.352755 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.352741 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 16:38:37.353838 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.353814 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:38:37.354052 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.353903 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4ddlh\"" Apr 16 16:38:37.357559 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.357535 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs"] Apr 16 16:38:37.394771 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.394737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5737296f-9d52-41f1-9a5b-16c5fe82910e-tls-certs\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.394969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.394793 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-dshm\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: 
\"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.394969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.394838 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-model-cache\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.394969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.394858 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-home\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.394969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.394878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pws\" (UniqueName: \"kubernetes.io/projected/5737296f-9d52-41f1-9a5b-16c5fe82910e-kube-api-access-l7pws\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.394969 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.394905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496330 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-dshm\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496338 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-model-cache\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-home\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pws\" (UniqueName: \"kubernetes.io/projected/5737296f-9d52-41f1-9a5b-16c5fe82910e-kube-api-access-l7pws\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496438 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5737296f-9d52-41f1-9a5b-16c5fe82910e-tls-certs\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496851 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-model-cache\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496900 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496861 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-home\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.496936 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.496912 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.498636 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.498614 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-dshm\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.498873 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.498855 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5737296f-9d52-41f1-9a5b-16c5fe82910e-tls-certs\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.507949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.507919 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pws\" (UniqueName: \"kubernetes.io/projected/5737296f-9d52-41f1-9a5b-16c5fe82910e-kube-api-access-l7pws\") pod \"precise-prefix-cache-test-kserve-554b968997-wxzbs\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.557190 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.557150 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w"] Apr 16 16:38:37.561462 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.561435 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.563753 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.563726 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-tlkxz\"" Apr 16 16:38:37.573244 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.573216 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w"] Apr 16 16:38:37.663428 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.663385 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:37.698615 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.698573 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn4kw\" (UniqueName: \"kubernetes.io/projected/26d498ce-9649-4ad8-92c5-b754631e03f0-kube-api-access-nn4kw\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.698782 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.698675 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d498ce-9649-4ad8-92c5-b754631e03f0-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.698838 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.698788 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.698838 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.698819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.799564 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.799532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.799742 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.799574 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.799742 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.799609 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-nn4kw\" (UniqueName: \"kubernetes.io/projected/26d498ce-9649-4ad8-92c5-b754631e03f0-kube-api-access-nn4kw\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.799742 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.799634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d498ce-9649-4ad8-92c5-b754631e03f0-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.799926 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.799906 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.799988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.799965 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.802342 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.802314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/26d498ce-9649-4ad8-92c5-b754631e03f0-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.807251 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.807226 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn4kw\" (UniqueName: \"kubernetes.io/projected/26d498ce-9649-4ad8-92c5-b754631e03f0-kube-api-access-nn4kw\") pod \"precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:37.812881 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.812855 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs"] Apr 16 16:38:37.814276 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:38:37.814249 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5737296f_9d52_41f1_9a5b_16c5fe82910e.slice/crio-ce8b689bc7029f0d4c35aa7f8d5e918220fa5528a1bb78d66ab15078879a8dc3 WatchSource:0}: Error finding container ce8b689bc7029f0d4c35aa7f8d5e918220fa5528a1bb78d66ab15078879a8dc3: Status 404 returned error can't find the container with id ce8b689bc7029f0d4c35aa7f8d5e918220fa5528a1bb78d66ab15078879a8dc3 Apr 16 16:38:37.872472 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:37.872444 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:38.004951 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:38.004923 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w"] Apr 16 16:38:38.007601 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:38:38.007569 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d498ce_9649_4ad8_92c5_b754631e03f0.slice/crio-744814ad96ee6b8ef9437c0689308898dc73955d92517e398d97fc4f38f616c0 WatchSource:0}: Error finding container 744814ad96ee6b8ef9437c0689308898dc73955d92517e398d97fc4f38f616c0: Status 404 returned error can't find the container with id 744814ad96ee6b8ef9437c0689308898dc73955d92517e398d97fc4f38f616c0 Apr 16 16:38:38.289995 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:38.289693 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerStarted","Data":"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86"} Apr 16 16:38:38.289995 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:38.289745 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerStarted","Data":"744814ad96ee6b8ef9437c0689308898dc73955d92517e398d97fc4f38f616c0"} Apr 16 16:38:38.291284 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:38.291251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" 
event={"ID":"5737296f-9d52-41f1-9a5b-16c5fe82910e","Type":"ContainerStarted","Data":"971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379"} Apr 16 16:38:38.291415 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:38.291348 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" event={"ID":"5737296f-9d52-41f1-9a5b-16c5fe82910e","Type":"ContainerStarted","Data":"ce8b689bc7029f0d4c35aa7f8d5e918220fa5528a1bb78d66ab15078879a8dc3"} Apr 16 16:38:39.297189 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:39.297148 2571 generic.go:358] "Generic (PLEG): container finished" podID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerID="8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86" exitCode=0 Apr 16 16:38:39.297660 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:39.297239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerDied","Data":"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86"} Apr 16 16:38:40.302928 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:40.302878 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerStarted","Data":"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4"} Apr 16 16:38:40.303311 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:40.302933 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerStarted","Data":"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb"} Apr 16 16:38:40.303311 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:38:40.302963 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:40.326283 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:40.326226 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" podStartSLOduration=3.326208236 podStartE2EDuration="3.326208236s" podCreationTimestamp="2026-04-16 16:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:38:40.322001869 +0000 UTC m=+924.156854771" watchObservedRunningTime="2026-04-16 16:38:40.326208236 +0000 UTC m=+924.161061175" Apr 16 16:38:43.317640 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:43.317606 2571 generic.go:358] "Generic (PLEG): container finished" podID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerID="971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379" exitCode=0 Apr 16 16:38:43.318099 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:43.317686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" event={"ID":"5737296f-9d52-41f1-9a5b-16c5fe82910e","Type":"ContainerDied","Data":"971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379"} Apr 16 16:38:44.324524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:44.324485 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" event={"ID":"5737296f-9d52-41f1-9a5b-16c5fe82910e","Type":"ContainerStarted","Data":"87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda"} Apr 16 16:38:44.346792 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:44.346741 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" podStartSLOduration=7.346724182 podStartE2EDuration="7.346724182s" podCreationTimestamp="2026-04-16 16:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:38:44.343358558 +0000 UTC m=+928.178211472" watchObservedRunningTime="2026-04-16 16:38:44.346724182 +0000 UTC m=+928.181577117" Apr 16 16:38:47.664324 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:47.664279 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:47.664324 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:47.664328 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:47.677215 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:47.677189 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:38:47.873245 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:47.873202 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:47.873454 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:47.873359 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:47.876014 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:47.875989 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:48.342815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:48.342784 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:38:48.352677 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:38:48.352653 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:39:10.350593 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:10.350561 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:39:29.670368 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:29.670327 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs"] Apr 16 16:39:29.672864 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:29.670617 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" podUID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerName="main" containerID="cri-o://87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda" gracePeriod=30 Apr 16 16:39:29.681923 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:29.681883 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w"] Apr 16 16:39:29.682488 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:29.682305 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="main" containerID="cri-o://da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb" gracePeriod=30 Apr 16 16:39:29.682488 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:29.682404 2571 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="tokenizer" containerID="cri-o://41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4" gracePeriod=30 Apr 16 16:39:29.944210 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:29.944183 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:39:30.030616 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.030589 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:39:30.071896 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.071858 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5737296f-9d52-41f1-9a5b-16c5fe82910e-tls-certs\") pod \"5737296f-9d52-41f1-9a5b-16c5fe82910e\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " Apr 16 16:39:30.072076 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.071928 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-kserve-provision-location\") pod \"5737296f-9d52-41f1-9a5b-16c5fe82910e\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " Apr 16 16:39:30.072076 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.071963 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-model-cache\") pod \"5737296f-9d52-41f1-9a5b-16c5fe82910e\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " Apr 16 16:39:30.072076 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:39:30.071994 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-home\") pod \"5737296f-9d52-41f1-9a5b-16c5fe82910e\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " Apr 16 16:39:30.072076 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.072020 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7pws\" (UniqueName: \"kubernetes.io/projected/5737296f-9d52-41f1-9a5b-16c5fe82910e-kube-api-access-l7pws\") pod \"5737296f-9d52-41f1-9a5b-16c5fe82910e\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " Apr 16 16:39:30.072399 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.072203 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-dshm\") pod \"5737296f-9d52-41f1-9a5b-16c5fe82910e\" (UID: \"5737296f-9d52-41f1-9a5b-16c5fe82910e\") " Apr 16 16:39:30.072399 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.072252 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-model-cache" (OuterVolumeSpecName: "model-cache") pod "5737296f-9d52-41f1-9a5b-16c5fe82910e" (UID: "5737296f-9d52-41f1-9a5b-16c5fe82910e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.072399 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.072270 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-home" (OuterVolumeSpecName: "home") pod "5737296f-9d52-41f1-9a5b-16c5fe82910e" (UID: "5737296f-9d52-41f1-9a5b-16c5fe82910e"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.072603 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.072584 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-model-cache\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.072672 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.072606 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-home\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.074164 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.074137 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5737296f-9d52-41f1-9a5b-16c5fe82910e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5737296f-9d52-41f1-9a5b-16c5fe82910e" (UID: "5737296f-9d52-41f1-9a5b-16c5fe82910e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:39:30.074561 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.074525 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-dshm" (OuterVolumeSpecName: "dshm") pod "5737296f-9d52-41f1-9a5b-16c5fe82910e" (UID: "5737296f-9d52-41f1-9a5b-16c5fe82910e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.074673 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.074632 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5737296f-9d52-41f1-9a5b-16c5fe82910e-kube-api-access-l7pws" (OuterVolumeSpecName: "kube-api-access-l7pws") pod "5737296f-9d52-41f1-9a5b-16c5fe82910e" (UID: "5737296f-9d52-41f1-9a5b-16c5fe82910e"). InnerVolumeSpecName "kube-api-access-l7pws". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:39:30.129761 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.129716 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5737296f-9d52-41f1-9a5b-16c5fe82910e" (UID: "5737296f-9d52-41f1-9a5b-16c5fe82910e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.174005 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.173904 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn4kw\" (UniqueName: \"kubernetes.io/projected/26d498ce-9649-4ad8-92c5-b754631e03f0-kube-api-access-nn4kw\") pod \"26d498ce-9649-4ad8-92c5-b754631e03f0\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " Apr 16 16:39:30.174005 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.173991 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-tokenizer-uds\") pod \"26d498ce-9649-4ad8-92c5-b754631e03f0\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " Apr 16 16:39:30.174289 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174028 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-kserve-provision-location\") pod \"26d498ce-9649-4ad8-92c5-b754631e03f0\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " Apr 16 16:39:30.174289 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174055 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d498ce-9649-4ad8-92c5-b754631e03f0-tls-certs\") pod 
\"26d498ce-9649-4ad8-92c5-b754631e03f0\" (UID: \"26d498ce-9649-4ad8-92c5-b754631e03f0\") " Apr 16 16:39:30.174289 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174278 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7pws\" (UniqueName: \"kubernetes.io/projected/5737296f-9d52-41f1-9a5b-16c5fe82910e-kube-api-access-l7pws\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.174289 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174292 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-dshm\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.174456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174305 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5737296f-9d52-41f1-9a5b-16c5fe82910e-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.174456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174319 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5737296f-9d52-41f1-9a5b-16c5fe82910e-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.174456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174323 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "26d498ce-9649-4ad8-92c5-b754631e03f0" (UID: "26d498ce-9649-4ad8-92c5-b754631e03f0"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.174793 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.174769 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "26d498ce-9649-4ad8-92c5-b754631e03f0" (UID: "26d498ce-9649-4ad8-92c5-b754631e03f0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:39:30.176209 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.176184 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d498ce-9649-4ad8-92c5-b754631e03f0-kube-api-access-nn4kw" (OuterVolumeSpecName: "kube-api-access-nn4kw") pod "26d498ce-9649-4ad8-92c5-b754631e03f0" (UID: "26d498ce-9649-4ad8-92c5-b754631e03f0"). InnerVolumeSpecName "kube-api-access-nn4kw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:39:30.176302 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.176211 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d498ce-9649-4ad8-92c5-b754631e03f0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "26d498ce-9649-4ad8-92c5-b754631e03f0" (UID: "26d498ce-9649-4ad8-92c5-b754631e03f0"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:39:30.275736 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.275693 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-tokenizer-uds\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.275736 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.275730 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d498ce-9649-4ad8-92c5-b754631e03f0-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.275736 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.275743 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/26d498ce-9649-4ad8-92c5-b754631e03f0-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.275984 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.275754 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nn4kw\" (UniqueName: \"kubernetes.io/projected/26d498ce-9649-4ad8-92c5-b754631e03f0-kube-api-access-nn4kw\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:39:30.516347 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.516271 2571 generic.go:358] "Generic (PLEG): container finished" podID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerID="87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda" exitCode=0 Apr 16 16:39:30.516487 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.516347 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" event={"ID":"5737296f-9d52-41f1-9a5b-16c5fe82910e","Type":"ContainerDied","Data":"87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda"} Apr 16 16:39:30.516487 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:39:30.516362 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" Apr 16 16:39:30.516487 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.516383 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs" event={"ID":"5737296f-9d52-41f1-9a5b-16c5fe82910e","Type":"ContainerDied","Data":"ce8b689bc7029f0d4c35aa7f8d5e918220fa5528a1bb78d66ab15078879a8dc3"} Apr 16 16:39:30.516487 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.516405 2571 scope.go:117] "RemoveContainer" containerID="87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda" Apr 16 16:39:30.518590 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.518558 2571 generic.go:358] "Generic (PLEG): container finished" podID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerID="41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4" exitCode=0 Apr 16 16:39:30.518731 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.518599 2571 generic.go:358] "Generic (PLEG): container finished" podID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerID="da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb" exitCode=0 Apr 16 16:39:30.518731 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.518666 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" Apr 16 16:39:30.518731 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.518677 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerDied","Data":"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4"} Apr 16 16:39:30.518731 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.518729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerDied","Data":"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb"} Apr 16 16:39:30.518896 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.518745 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w" event={"ID":"26d498ce-9649-4ad8-92c5-b754631e03f0","Type":"ContainerDied","Data":"744814ad96ee6b8ef9437c0689308898dc73955d92517e398d97fc4f38f616c0"} Apr 16 16:39:30.525949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.525932 2571 scope.go:117] "RemoveContainer" containerID="971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379" Apr 16 16:39:30.537353 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.537329 2571 scope.go:117] "RemoveContainer" containerID="87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda" Apr 16 16:39:30.537679 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:39:30.537647 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda\": container with ID starting with 
87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda not found: ID does not exist" containerID="87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda" Apr 16 16:39:30.537766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.537685 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda"} err="failed to get container status \"87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda\": rpc error: code = NotFound desc = could not find container \"87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda\": container with ID starting with 87519b00b5cff6d9ea66c0ae3d98a163f95f0e994f5fcc74cbb6c25388821eda not found: ID does not exist" Apr 16 16:39:30.537766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.537704 2571 scope.go:117] "RemoveContainer" containerID="971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379" Apr 16 16:39:30.537949 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:39:30.537931 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379\": container with ID starting with 971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379 not found: ID does not exist" containerID="971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379" Apr 16 16:39:30.538103 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.537953 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379"} err="failed to get container status \"971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379\": rpc error: code = NotFound desc = could not find container \"971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379\": container with ID starting with 
971745db3f410f690dc3c91c4e07c8c62bc98f2d32427b3d13d0f91a6ccd2379 not found: ID does not exist" Apr 16 16:39:30.538103 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.537966 2571 scope.go:117] "RemoveContainer" containerID="41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4" Apr 16 16:39:30.546892 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.546867 2571 scope.go:117] "RemoveContainer" containerID="da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb" Apr 16 16:39:30.553634 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.553604 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs"] Apr 16 16:39:30.557682 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.557664 2571 scope.go:117] "RemoveContainer" containerID="8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86" Apr 16 16:39:30.559882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.559859 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-554b968997-wxzbs"] Apr 16 16:39:30.569771 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.569746 2571 scope.go:117] "RemoveContainer" containerID="41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4" Apr 16 16:39:30.570358 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:39:30.570331 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4\": container with ID starting with 41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4 not found: ID does not exist" containerID="41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4" Apr 16 16:39:30.570445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.570364 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4"} err="failed to get container status \"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4\": rpc error: code = NotFound desc = could not find container \"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4\": container with ID starting with 41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4 not found: ID does not exist" Apr 16 16:39:30.570445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.570387 2571 scope.go:117] "RemoveContainer" containerID="da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb" Apr 16 16:39:30.570678 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:39:30.570655 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb\": container with ID starting with da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb not found: ID does not exist" containerID="da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb" Apr 16 16:39:30.570734 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.570688 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb"} err="failed to get container status \"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb\": rpc error: code = NotFound desc = could not find container \"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb\": container with ID starting with da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb not found: ID does not exist" Apr 16 16:39:30.570734 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.570713 2571 scope.go:117] "RemoveContainer" containerID="8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86" Apr 16 16:39:30.570949 ip-10-0-138-125 
kubenswrapper[2571]: E0416 16:39:30.570932 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86\": container with ID starting with 8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86 not found: ID does not exist" containerID="8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86" Apr 16 16:39:30.570993 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.570954 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86"} err="failed to get container status \"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86\": rpc error: code = NotFound desc = could not find container \"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86\": container with ID starting with 8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86 not found: ID does not exist" Apr 16 16:39:30.570993 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.570973 2571 scope.go:117] "RemoveContainer" containerID="41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4" Apr 16 16:39:30.571266 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.571241 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4"} err="failed to get container status \"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4\": rpc error: code = NotFound desc = could not find container \"41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4\": container with ID starting with 41a732f18d1dbc2d1b546ed36b1324390398f11ca58f5c851d0f1d9b36fb99c4 not found: ID does not exist" Apr 16 16:39:30.571365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.571268 2571 scope.go:117] "RemoveContainer" 
containerID="da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb" Apr 16 16:39:30.571966 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.571939 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb"} err="failed to get container status \"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb\": rpc error: code = NotFound desc = could not find container \"da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb\": container with ID starting with da5e3b05fd003fc8646b0f7cacda77921506dfa8b42551a02a8a305829896ceb not found: ID does not exist" Apr 16 16:39:30.571966 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.571966 2571 scope.go:117] "RemoveContainer" containerID="8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86" Apr 16 16:39:30.572452 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.572325 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86"} err="failed to get container status \"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86\": rpc error: code = NotFound desc = could not find container \"8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86\": container with ID starting with 8e1a03f7728cbec740414caecbd66b74b5645dcffd9a115de0cfba36c9a55d86 not found: ID does not exist" Apr 16 16:39:30.574519 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.574496 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w"] Apr 16 16:39:30.579270 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.579243 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-85554c459kw9w"] Apr 16 16:39:30.762765 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:39:30.757933 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" path="/var/lib/kubelet/pods/26d498ce-9649-4ad8-92c5-b754631e03f0/volumes" Apr 16 16:39:30.762765 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:30.758707 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5737296f-9d52-41f1-9a5b-16c5fe82910e" path="/var/lib/kubelet/pods/5737296f-9d52-41f1-9a5b-16c5fe82910e/volumes" Apr 16 16:39:45.524439 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524347 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p"] Apr 16 16:39:45.524923 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524898 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerName="main" Apr 16 16:39:45.524923 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524918 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerName="main" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524931 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="main" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524936 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="main" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524956 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="storage-initializer" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524963 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" 
containerName="storage-initializer" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524970 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="tokenizer" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524976 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="tokenizer" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524983 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerName="storage-initializer" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.524989 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerName="storage-initializer" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.525068 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="main" Apr 16 16:39:45.525069 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.525077 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5737296f-9d52-41f1-9a5b-16c5fe82910e" containerName="main" Apr 16 16:39:45.525595 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.525084 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d498ce-9649-4ad8-92c5-b754631e03f0" containerName="tokenizer" Apr 16 16:39:45.528264 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.528244 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.530853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.530810 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-7qrr8\"" Apr 16 16:39:45.530853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.530826 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:39:45.531076 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.530871 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 16:39:45.531076 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.530814 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:39:45.539640 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.539608 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p"] Apr 16 16:39:45.612167 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.612094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.612344 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.612207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9wdb\" (UniqueName: 
\"kubernetes.io/projected/b4ce878e-479e-48fb-9f0f-8b281cf87328-kube-api-access-c9wdb\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.612344 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.612272 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.612344 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.612325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ce878e-479e-48fb-9f0f-8b281cf87328-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.713452 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.713409 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.713639 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.713465 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b4ce878e-479e-48fb-9f0f-8b281cf87328-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.713639 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.713510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.713639 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.713530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9wdb\" (UniqueName: \"kubernetes.io/projected/b4ce878e-479e-48fb-9f0f-8b281cf87328-kube-api-access-c9wdb\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.713874 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.713854 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.713926 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.713903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.715985 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.715964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ce878e-479e-48fb-9f0f-8b281cf87328-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.721971 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.721944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9wdb\" (UniqueName: \"kubernetes.io/projected/b4ce878e-479e-48fb-9f0f-8b281cf87328-kube-api-access-c9wdb\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.840911 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.840811 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:45.976537 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:45.976504 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p"] Apr 16 16:39:45.978251 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:39:45.978216 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4ce878e_479e_48fb_9f0f_8b281cf87328.slice/crio-1bce871c53f0120afad2a5cca06418ffa056ba0d2d45d1ccc807da7b2e66e13d WatchSource:0}: Error finding container 1bce871c53f0120afad2a5cca06418ffa056ba0d2d45d1ccc807da7b2e66e13d: Status 404 returned error can't find the container with id 1bce871c53f0120afad2a5cca06418ffa056ba0d2d45d1ccc807da7b2e66e13d Apr 16 16:39:46.588766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:46.588721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerStarted","Data":"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"} Apr 16 16:39:46.588766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:46.588769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerStarted","Data":"1bce871c53f0120afad2a5cca06418ffa056ba0d2d45d1ccc807da7b2e66e13d"} Apr 16 16:39:47.594486 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:47.594449 2571 generic.go:358] "Generic (PLEG): container finished" podID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerID="e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9" exitCode=0 Apr 16 16:39:47.594868 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:47.594537 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerDied","Data":"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"} Apr 16 16:39:48.601478 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:48.601436 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerStarted","Data":"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8"} Apr 16 16:39:48.601478 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:48.601484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerStarted","Data":"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"} Apr 16 16:39:48.602018 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:48.601517 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:48.624455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:48.624403 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" podStartSLOduration=3.624386466 podStartE2EDuration="3.624386466s" podCreationTimestamp="2026-04-16 16:39:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:39:48.621737514 +0000 UTC m=+992.456590418" watchObservedRunningTime="2026-04-16 16:39:48.624386466 +0000 UTC m=+992.459239405" Apr 16 16:39:55.841943 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:55.841900 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:55.841943 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:55.841952 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:55.844788 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:55.844764 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:39:56.633225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:39:56.633197 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:40:17.638022 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:40:17.637992 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:41:36.636053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:36.635963 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p"] Apr 16 16:41:36.636572 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:36.636407 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="main" containerID="cri-o://220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d" gracePeriod=30 Apr 16 16:41:36.636572 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:36.636440 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" 
podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="tokenizer" containerID="cri-o://79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8" gracePeriod=30 Apr 16 16:41:36.999498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:36.999470 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:41:37.024930 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.024897 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9wdb\" (UniqueName: \"kubernetes.io/projected/b4ce878e-479e-48fb-9f0f-8b281cf87328-kube-api-access-c9wdb\") pod \"b4ce878e-479e-48fb-9f0f-8b281cf87328\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " Apr 16 16:41:37.025165 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.024993 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ce878e-479e-48fb-9f0f-8b281cf87328-tls-certs\") pod \"b4ce878e-479e-48fb-9f0f-8b281cf87328\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " Apr 16 16:41:37.025165 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.025034 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-kserve-provision-location\") pod \"b4ce878e-479e-48fb-9f0f-8b281cf87328\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " Apr 16 16:41:37.025165 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.025154 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-tokenizer-uds\") pod \"b4ce878e-479e-48fb-9f0f-8b281cf87328\" (UID: \"b4ce878e-479e-48fb-9f0f-8b281cf87328\") " Apr 16 16:41:37.025587 ip-10-0-138-125 kubenswrapper[2571]: 
I0416 16:41:37.025561 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b4ce878e-479e-48fb-9f0f-8b281cf87328" (UID: "b4ce878e-479e-48fb-9f0f-8b281cf87328"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:41:37.026219 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.026177 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b4ce878e-479e-48fb-9f0f-8b281cf87328" (UID: "b4ce878e-479e-48fb-9f0f-8b281cf87328"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:41:37.026711 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.026684 2571 generic.go:358] "Generic (PLEG): container finished" podID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerID="79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8" exitCode=0 Apr 16 16:41:37.026822 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.026712 2571 generic.go:358] "Generic (PLEG): container finished" podID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerID="220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d" exitCode=0 Apr 16 16:41:37.026913 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.026895 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerDied","Data":"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8"} Apr 16 16:41:37.026978 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.026932 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerDied","Data":"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"} Apr 16 16:41:37.026978 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.026949 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" event={"ID":"b4ce878e-479e-48fb-9f0f-8b281cf87328","Type":"ContainerDied","Data":"1bce871c53f0120afad2a5cca06418ffa056ba0d2d45d1ccc807da7b2e66e13d"} Apr 16 16:41:37.026978 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.026969 2571 scope.go:117] "RemoveContainer" containerID="79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8" Apr 16 16:41:37.027447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.027360 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p" Apr 16 16:41:37.027603 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.027565 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ce878e-479e-48fb-9f0f-8b281cf87328-kube-api-access-c9wdb" (OuterVolumeSpecName: "kube-api-access-c9wdb") pod "b4ce878e-479e-48fb-9f0f-8b281cf87328" (UID: "b4ce878e-479e-48fb-9f0f-8b281cf87328"). InnerVolumeSpecName "kube-api-access-c9wdb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:41:37.027798 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.027774 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ce878e-479e-48fb-9f0f-8b281cf87328-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b4ce878e-479e-48fb-9f0f-8b281cf87328" (UID: "b4ce878e-479e-48fb-9f0f-8b281cf87328"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:41:37.041947 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.041925 2571 scope.go:117] "RemoveContainer" containerID="220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"
Apr 16 16:41:37.050760 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.050736 2571 scope.go:117] "RemoveContainer" containerID="e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"
Apr 16 16:41:37.059185 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.059164 2571 scope.go:117] "RemoveContainer" containerID="79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8"
Apr 16 16:41:37.059498 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:37.059479 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8\": container with ID starting with 79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8 not found: ID does not exist" containerID="79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8"
Apr 16 16:41:37.059562 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.059508 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8"} err="failed to get container status \"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8\": rpc error: code = NotFound desc = could not find container \"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8\": container with ID starting with 79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8 not found: ID does not exist"
Apr 16 16:41:37.059562 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.059529 2571 scope.go:117] "RemoveContainer" containerID="220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"
Apr 16 16:41:37.059740 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:37.059722 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d\": container with ID starting with 220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d not found: ID does not exist" containerID="220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"
Apr 16 16:41:37.059781 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.059744 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"} err="failed to get container status \"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d\": rpc error: code = NotFound desc = could not find container \"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d\": container with ID starting with 220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d not found: ID does not exist"
Apr 16 16:41:37.059781 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.059757 2571 scope.go:117] "RemoveContainer" containerID="e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"
Apr 16 16:41:37.059938 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:37.059922 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9\": container with ID starting with e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9 not found: ID does not exist" containerID="e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"
Apr 16 16:41:37.059978 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.059942 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"} err="failed to get container status \"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9\": rpc error: code = NotFound desc = could not find container \"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9\": container with ID starting with e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9 not found: ID does not exist"
Apr 16 16:41:37.059978 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.059953 2571 scope.go:117] "RemoveContainer" containerID="79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8"
Apr 16 16:41:37.060171 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.060152 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8"} err="failed to get container status \"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8\": rpc error: code = NotFound desc = could not find container \"79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8\": container with ID starting with 79a3d01dafaf1cb6d146b090d7a6852ae33db8afda5155558defdf9b61c1a7b8 not found: ID does not exist"
Apr 16 16:41:37.060220 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.060172 2571 scope.go:117] "RemoveContainer" containerID="220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"
Apr 16 16:41:37.060383 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.060359 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d"} err="failed to get container status \"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d\": rpc error: code = NotFound desc = could not find container \"220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d\": container with ID starting with 220f163f8d85df07d699f3fcaee204aa839e92c09766d93b60e0f19de7e4194d not found: ID does not exist"
Apr 16 16:41:37.060456 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.060385 2571 scope.go:117] "RemoveContainer" containerID="e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"
Apr 16 16:41:37.060604 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.060584 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9"} err="failed to get container status \"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9\": rpc error: code = NotFound desc = could not find container \"e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9\": container with ID starting with e6189217e33fec27f26fb111290dda640b4d4592e8d45fcd2052070d94bb94a9 not found: ID does not exist"
Apr 16 16:41:37.126501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.126468 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:41:37.126501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.126499 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b4ce878e-479e-48fb-9f0f-8b281cf87328-tokenizer-uds\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:41:37.126501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.126510 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9wdb\" (UniqueName: \"kubernetes.io/projected/b4ce878e-479e-48fb-9f0f-8b281cf87328-kube-api-access-c9wdb\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:41:37.126501 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.126519 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b4ce878e-479e-48fb-9f0f-8b281cf87328-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:41:37.350908 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.350877 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p"]
Apr 16 16:41:37.355380 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:37.355351 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-d5k7p"]
Apr 16 16:41:38.756262 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:38.756231 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" path="/var/lib/kubelet/pods/b4ce878e-479e-48fb-9f0f-8b281cf87328/volumes"
Apr 16 16:41:56.645406 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645365 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"]
Apr 16 16:41:56.645840 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645766 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="storage-initializer"
Apr 16 16:41:56.645840 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645778 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="storage-initializer"
Apr 16 16:41:56.645840 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645787 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="tokenizer"
Apr 16 16:41:56.645840 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645793 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="tokenizer"
Apr 16 16:41:56.645840 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645803 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="main"
Apr 16 16:41:56.645840 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645808 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="main"
Apr 16 16:41:56.646031 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645872 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="main"
Apr 16 16:41:56.646031 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.645882 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4ce878e-479e-48fb-9f0f-8b281cf87328" containerName="tokenizer"
Apr 16 16:41:56.649146 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.649107 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.656663 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:56.656555 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"stop-feature-test-epp-sa-dockercfg-sxxng\" is forbidden: User \"system:node:ip-10-0-138-125.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-138-125.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-sxxng\"" type="*v1.Secret"
Apr 16 16:41:56.656744 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:56.656660 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-138-125.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-138-125.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap"
Apr 16 16:41:56.658135 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.658069 2571 status_manager.go:895] "Failed to get status for pod" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" err="pods \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" is forbidden: User \"system:node:ip-10-0-138-125.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-138-125.ec2.internal' and this object"
Apr 16 16:41:56.658135 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:56.658095 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-138-125.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-138-125.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap"
Apr 16 16:41:56.658963 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:56.658940 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"stop-feature-test-kserve-self-signed-certs\" is forbidden: User \"system:node:ip-10-0-138-125.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-138-125.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" type="*v1.Secret"
Apr 16 16:41:56.701886 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.701848 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"]
Apr 16 16:41:56.808470 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.808428 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.808470 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.808476 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfrxz\" (UniqueName: \"kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.808729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.808534 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.808729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.808640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.909499 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.909466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.909696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.909504 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrxz\" (UniqueName: \"kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.909696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.909543 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.909867 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.909845 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.909941 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.909927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:56.910244 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:56.910227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:57.866208 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:57.866178 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:41:57.910656 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:57.910621 2571 secret.go:189] Couldn't get secret kserve-ci-e2e-test/stop-feature-test-kserve-self-signed-certs: failed to sync secret cache: timed out waiting for the condition
Apr 16 16:41:57.910817 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:57.910715 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs podName:8533e0ad-71cd-49b2-b6d1-499d7324584c nodeName:}" failed. No retries permitted until 2026-04-16 16:41:58.410697852 +0000 UTC m=+1122.245550734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs") pod "stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" (UID: "8533e0ad-71cd-49b2-b6d1-499d7324584c") : failed to sync secret cache: timed out waiting for the condition
Apr 16 16:41:57.932613 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:57.932575 2571 projected.go:289] Couldn't get configMap kserve-ci-e2e-test/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Apr 16 16:41:57.932613 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:57.932618 2571 projected.go:194] Error preparing data for projected volume kube-api-access-cfrxz for pod kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b: failed to sync configmap cache: timed out waiting for the condition
Apr 16 16:41:57.932820 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:41:57.932677 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz podName:8533e0ad-71cd-49b2-b6d1-499d7324584c nodeName:}" failed. No retries permitted until 2026-04-16 16:41:58.432660094 +0000 UTC m=+1122.267512976 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cfrxz" (UniqueName: "kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz") pod "stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" (UID: "8533e0ad-71cd-49b2-b6d1-499d7324584c") : failed to sync configmap cache: timed out waiting for the condition
Apr 16 16:41:57.943097 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:57.943068 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-sxxng\""
Apr 16 16:41:58.130524 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.130433 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 16:41:58.150731 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.150701 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:41:58.423453 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.423418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:58.425924 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.425894 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:58.524571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.524531 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cfrxz\" (UniqueName: \"kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:58.527082 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.527050 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfrxz\" (UniqueName: \"kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz\") pod \"stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:58.759165 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.759076 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:41:58.902417 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.902384 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"]
Apr 16 16:41:58.904057 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:41:58.904029 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8533e0ad_71cd_49b2_b6d1_499d7324584c.slice/crio-5d9abfdbcbbfb243116168f634a17f6bd3fad04bade89c3282da7740ad4e5b67 WatchSource:0}: Error finding container 5d9abfdbcbbfb243116168f634a17f6bd3fad04bade89c3282da7740ad4e5b67: Status 404 returned error can't find the container with id 5d9abfdbcbbfb243116168f634a17f6bd3fad04bade89c3282da7740ad4e5b67
Apr 16 16:41:58.906263 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:58.906242 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:41:59.120766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:59.120670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerStarted","Data":"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7"}
Apr 16 16:41:59.120766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:41:59.120713 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerStarted","Data":"5d9abfdbcbbfb243116168f634a17f6bd3fad04bade89c3282da7740ad4e5b67"}
Apr 16 16:42:00.125651 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:00.125562 2571 generic.go:358] "Generic (PLEG): container finished" podID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerID="ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7" exitCode=0
Apr 16 16:42:00.126020 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:00.125645 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerDied","Data":"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7"}
Apr 16 16:42:01.131766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:01.131730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerStarted","Data":"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3"}
Apr 16 16:42:01.131766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:01.131770 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerStarted","Data":"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717"}
Apr 16 16:42:01.132240 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:01.131837 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:42:01.153988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:01.153916 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" podStartSLOduration=5.153896586 podStartE2EDuration="5.153896586s" podCreationTimestamp="2026-04-16 16:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:42:01.153253942 +0000 UTC m=+1124.988106845" watchObservedRunningTime="2026-04-16 16:42:01.153896586 +0000 UTC m=+1124.988749491"
Apr 16 16:42:08.759827 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:08.759793 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:42:08.759827 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:08.759843 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:42:08.763345 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:08.763056 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:42:09.163144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:09.163095 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:42:30.167319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:42:30.167285 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:43:16.761887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:16.761860 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:43:16.765538 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:16.765511 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log"
Apr 16 16:43:16.765733 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:16.765716 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:43:16.769380 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:16.769362 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log"
Apr 16 16:43:47.914224 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:47.914174 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"]
Apr 16 16:43:47.914796 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:47.914581 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="main" containerID="cri-o://03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717" gracePeriod=30
Apr 16 16:43:47.914796 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:47.914671 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="tokenizer" containerID="cri-o://c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3" gracePeriod=30
Apr 16 16:43:48.267794 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.267768 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:43:48.324168 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.324132 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfrxz\" (UniqueName: \"kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz\") pod \"8533e0ad-71cd-49b2-b6d1-499d7324584c\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") "
Apr 16 16:43:48.324359 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.324212 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs\") pod \"8533e0ad-71cd-49b2-b6d1-499d7324584c\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") "
Apr 16 16:43:48.324359 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.324298 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-tokenizer-uds\") pod \"8533e0ad-71cd-49b2-b6d1-499d7324584c\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") "
Apr 16 16:43:48.324359 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.324325 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-kserve-provision-location\") pod \"8533e0ad-71cd-49b2-b6d1-499d7324584c\" (UID: \"8533e0ad-71cd-49b2-b6d1-499d7324584c\") "
Apr 16 16:43:48.324584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.324557 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8533e0ad-71cd-49b2-b6d1-499d7324584c" (UID: "8533e0ad-71cd-49b2-b6d1-499d7324584c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:43:48.325082 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.325057 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8533e0ad-71cd-49b2-b6d1-499d7324584c" (UID: "8533e0ad-71cd-49b2-b6d1-499d7324584c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:43:48.326360 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.326335 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz" (OuterVolumeSpecName: "kube-api-access-cfrxz") pod "8533e0ad-71cd-49b2-b6d1-499d7324584c" (UID: "8533e0ad-71cd-49b2-b6d1-499d7324584c"). InnerVolumeSpecName "kube-api-access-cfrxz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:43:48.326419 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.326391 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8533e0ad-71cd-49b2-b6d1-499d7324584c" (UID: "8533e0ad-71cd-49b2-b6d1-499d7324584c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:43:48.425706 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.425666 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-tokenizer-uds\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:43:48.425706 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.425701 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8533e0ad-71cd-49b2-b6d1-499d7324584c-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:43:48.425706 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.425713 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cfrxz\" (UniqueName: \"kubernetes.io/projected/8533e0ad-71cd-49b2-b6d1-499d7324584c-kube-api-access-cfrxz\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:43:48.425959 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.425725 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8533e0ad-71cd-49b2-b6d1-499d7324584c-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:43:48.560601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.560507 2571 generic.go:358] "Generic (PLEG): container finished" podID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerID="c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3" exitCode=0
Apr 16 16:43:48.560601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.560533 2571 generic.go:358] "Generic (PLEG): container finished" podID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerID="03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717" exitCode=0
Apr 16 16:43:48.560601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.560579 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"
Apr 16 16:43:48.560882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.560590 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerDied","Data":"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3"}
Apr 16 16:43:48.560882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.560641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerDied","Data":"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717"}
Apr 16 16:43:48.560882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.560656 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b" event={"ID":"8533e0ad-71cd-49b2-b6d1-499d7324584c","Type":"ContainerDied","Data":"5d9abfdbcbbfb243116168f634a17f6bd3fad04bade89c3282da7740ad4e5b67"}
Apr 16 16:43:48.560882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.560674 2571 scope.go:117] "RemoveContainer" containerID="c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3"
Apr 16 16:43:48.570580 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.570561 2571 scope.go:117] "RemoveContainer" containerID="03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717"
Apr 16 16:43:48.579268 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.579247 2571 scope.go:117] "RemoveContainer" containerID="ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7"
Apr 16 16:43:48.587496 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.587476 2571 scope.go:117] "RemoveContainer" containerID="c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3"
Apr 16 16:43:48.587784 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:43:48.587760 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3\": container with ID starting with c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3 not found: ID does not exist" containerID="c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3"
Apr 16 16:43:48.587833 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.587796 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3"} err="failed to get container status \"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3\": rpc error: code = NotFound desc = could not find container \"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3\": container with ID starting with c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3 not found: ID does not exist"
Apr 16 16:43:48.587833 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.587815 2571 scope.go:117] "RemoveContainer" containerID="03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717"
Apr 16 16:43:48.588100 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:43:48.588082 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717\": container with ID starting with 03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717 not found: ID does not exist" containerID="03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717"
Apr 16 16:43:48.588205 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588105 2571 pod_container_deletor.go:53] "DeleteContainer returned
error" containerID={"Type":"cri-o","ID":"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717"} err="failed to get container status \"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717\": rpc error: code = NotFound desc = could not find container \"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717\": container with ID starting with 03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717 not found: ID does not exist" Apr 16 16:43:48.588205 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588141 2571 scope.go:117] "RemoveContainer" containerID="ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7" Apr 16 16:43:48.588403 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:43:48.588386 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7\": container with ID starting with ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7 not found: ID does not exist" containerID="ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7" Apr 16 16:43:48.588443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588409 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7"} err="failed to get container status \"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7\": rpc error: code = NotFound desc = could not find container \"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7\": container with ID starting with ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7 not found: ID does not exist" Apr 16 16:43:48.588443 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588430 2571 scope.go:117] "RemoveContainer" containerID="c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3" Apr 16 16:43:48.588665 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588648 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3"} err="failed to get container status \"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3\": rpc error: code = NotFound desc = could not find container \"c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3\": container with ID starting with c1f022185fd99c884d1f8babc9df98240fc8001f5d2b85a8eccf85e61081e7c3 not found: ID does not exist" Apr 16 16:43:48.588707 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588666 2571 scope.go:117] "RemoveContainer" containerID="03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717" Apr 16 16:43:48.588881 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588860 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717"} err="failed to get container status \"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717\": rpc error: code = NotFound desc = could not find container \"03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717\": container with ID starting with 03ceb99584e62d6d59469b4f7fd8fdb52fd1a66edc80edaa09b2366f5f566717 not found: ID does not exist" Apr 16 16:43:48.588958 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.588883 2571 scope.go:117] "RemoveContainer" containerID="ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7" Apr 16 16:43:48.589103 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.589085 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7"} err="failed to get container status \"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7\": rpc error: code = NotFound desc = could not 
find container \"ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7\": container with ID starting with ec02faf621be4bfa29d09bb620aec7bdb061f1bb2f826bd2ed9b1c080cc4e6c7 not found: ID does not exist" Apr 16 16:43:48.594413 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.594378 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"] Apr 16 16:43:48.600573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.600539 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7b969df585-rfb4b"] Apr 16 16:43:48.756228 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:48.756189 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" path="/var/lib/kubelet/pods/8533e0ad-71cd-49b2-b6d1-499d7324584c/volumes" Apr 16 16:43:50.486504 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486461 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-966449c67-9z4cp"] Apr 16 16:43:50.486887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486856 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="tokenizer" Apr 16 16:43:50.486887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486868 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="tokenizer" Apr 16 16:43:50.486887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486882 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="storage-initializer" Apr 16 16:43:50.486887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486887 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" 
containerName="storage-initializer" Apr 16 16:43:50.487019 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486894 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="main" Apr 16 16:43:50.487019 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486899 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="main" Apr 16 16:43:50.487019 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486982 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="main" Apr 16 16:43:50.487019 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.486990 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="8533e0ad-71cd-49b2-b6d1-499d7324584c" containerName="tokenizer" Apr 16 16:43:50.491757 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.491738 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.504307 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.504275 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-966449c67-9z4cp"] Apr 16 16:43:50.545108 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.545067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee09a009-c8f0-4fba-9206-58596c2a7b93-cert\") pod \"llmisvc-controller-manager-966449c67-9z4cp\" (UID: \"ee09a009-c8f0-4fba-9206-58596c2a7b93\") " pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.545302 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.545191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhxxf\" (UniqueName: \"kubernetes.io/projected/ee09a009-c8f0-4fba-9206-58596c2a7b93-kube-api-access-zhxxf\") pod \"llmisvc-controller-manager-966449c67-9z4cp\" (UID: \"ee09a009-c8f0-4fba-9206-58596c2a7b93\") " pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.646133 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.646091 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee09a009-c8f0-4fba-9206-58596c2a7b93-cert\") pod \"llmisvc-controller-manager-966449c67-9z4cp\" (UID: \"ee09a009-c8f0-4fba-9206-58596c2a7b93\") " pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.646336 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.646200 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhxxf\" (UniqueName: \"kubernetes.io/projected/ee09a009-c8f0-4fba-9206-58596c2a7b93-kube-api-access-zhxxf\") pod \"llmisvc-controller-manager-966449c67-9z4cp\" (UID: \"ee09a009-c8f0-4fba-9206-58596c2a7b93\") " 
pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.648464 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.648438 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee09a009-c8f0-4fba-9206-58596c2a7b93-cert\") pod \"llmisvc-controller-manager-966449c67-9z4cp\" (UID: \"ee09a009-c8f0-4fba-9206-58596c2a7b93\") " pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.654914 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.654877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhxxf\" (UniqueName: \"kubernetes.io/projected/ee09a009-c8f0-4fba-9206-58596c2a7b93-kube-api-access-zhxxf\") pod \"llmisvc-controller-manager-966449c67-9z4cp\" (UID: \"ee09a009-c8f0-4fba-9206-58596c2a7b93\") " pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.802469 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.802368 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:50.933774 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:50.933738 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-966449c67-9z4cp"] Apr 16 16:43:50.935264 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:43:50.935238 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podee09a009_c8f0_4fba_9206_58596c2a7b93.slice/crio-c74b20eee4b10ced54806d14dd43c625e63b19ffff3598a9d19d7a6982894d5e WatchSource:0}: Error finding container c74b20eee4b10ced54806d14dd43c625e63b19ffff3598a9d19d7a6982894d5e: Status 404 returned error can't find the container with id c74b20eee4b10ced54806d14dd43c625e63b19ffff3598a9d19d7a6982894d5e Apr 16 16:43:51.573364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:51.573328 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" event={"ID":"ee09a009-c8f0-4fba-9206-58596c2a7b93","Type":"ContainerStarted","Data":"449ad20c15524db90a2e4b55ced68e424bd25040d5752e85e02138712c590922"} Apr 16 16:43:51.573364 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:51.573367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" event={"ID":"ee09a009-c8f0-4fba-9206-58596c2a7b93","Type":"ContainerStarted","Data":"c74b20eee4b10ced54806d14dd43c625e63b19ffff3598a9d19d7a6982894d5e"} Apr 16 16:43:51.573834 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:51.573449 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:43:51.595869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:43:51.595809 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" podStartSLOduration=1.125901725 podStartE2EDuration="1.595793371s" 
podCreationTimestamp="2026-04-16 16:43:50 +0000 UTC" firstStartedPulling="2026-04-16 16:43:50.936549696 +0000 UTC m=+1234.771402578" lastFinishedPulling="2026-04-16 16:43:51.406441335 +0000 UTC m=+1235.241294224" observedRunningTime="2026-04-16 16:43:51.593455028 +0000 UTC m=+1235.428307944" watchObservedRunningTime="2026-04-16 16:43:51.595793371 +0000 UTC m=+1235.430646274" Apr 16 16:44:22.580431 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.580397 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-966449c67-9z4cp" Apr 16 16:44:22.630639 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.630593 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-dcccd46b9-qvj2p"] Apr 16 16:44:22.630928 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.630898 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" podUID="ca0915d6-9e74-4495-a517-0eaf62c5ef18" containerName="manager" containerID="cri-o://4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb" gracePeriod=30 Apr 16 16:44:22.884417 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.884390 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:44:22.946851 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.946808 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mjr\" (UniqueName: \"kubernetes.io/projected/ca0915d6-9e74-4495-a517-0eaf62c5ef18-kube-api-access-28mjr\") pod \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " Apr 16 16:44:22.947060 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.946929 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca0915d6-9e74-4495-a517-0eaf62c5ef18-cert\") pod \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\" (UID: \"ca0915d6-9e74-4495-a517-0eaf62c5ef18\") " Apr 16 16:44:22.948936 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.948901 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0915d6-9e74-4495-a517-0eaf62c5ef18-kube-api-access-28mjr" (OuterVolumeSpecName: "kube-api-access-28mjr") pod "ca0915d6-9e74-4495-a517-0eaf62c5ef18" (UID: "ca0915d6-9e74-4495-a517-0eaf62c5ef18"). InnerVolumeSpecName "kube-api-access-28mjr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:44:22.948936 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:22.948904 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca0915d6-9e74-4495-a517-0eaf62c5ef18-cert" (OuterVolumeSpecName: "cert") pod "ca0915d6-9e74-4495-a517-0eaf62c5ef18" (UID: "ca0915d6-9e74-4495-a517-0eaf62c5ef18"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:44:23.047757 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.047717 2571 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ca0915d6-9e74-4495-a517-0eaf62c5ef18-cert\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:44:23.047757 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.047750 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28mjr\" (UniqueName: \"kubernetes.io/projected/ca0915d6-9e74-4495-a517-0eaf62c5ef18-kube-api-access-28mjr\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:44:23.714553 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.714513 2571 generic.go:358] "Generic (PLEG): container finished" podID="ca0915d6-9e74-4495-a517-0eaf62c5ef18" containerID="4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb" exitCode=0 Apr 16 16:44:23.714999 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.714586 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" Apr 16 16:44:23.714999 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.714594 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" event={"ID":"ca0915d6-9e74-4495-a517-0eaf62c5ef18","Type":"ContainerDied","Data":"4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb"} Apr 16 16:44:23.714999 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.714635 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-dcccd46b9-qvj2p" event={"ID":"ca0915d6-9e74-4495-a517-0eaf62c5ef18","Type":"ContainerDied","Data":"a22cd03f24c53fe16151c430473a1f36fa6738eec83c67a7db7182053b28488c"} Apr 16 16:44:23.714999 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.714651 2571 scope.go:117] "RemoveContainer" containerID="4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb" Apr 16 16:44:23.724413 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.724391 2571 scope.go:117] "RemoveContainer" containerID="4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb" Apr 16 16:44:23.724723 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:44:23.724702 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb\": container with ID starting with 4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb not found: ID does not exist" containerID="4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb" Apr 16 16:44:23.724807 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.724738 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb"} err="failed to get container status 
\"4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb\": rpc error: code = NotFound desc = could not find container \"4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb\": container with ID starting with 4a5ce0b7b60a2c6d45e6754bd316cfd7ed948791c05113c83906e727891d73fb not found: ID does not exist" Apr 16 16:44:23.736173 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.736134 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-dcccd46b9-qvj2p"] Apr 16 16:44:23.740100 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:23.740071 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-dcccd46b9-qvj2p"] Apr 16 16:44:24.756107 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:44:24.756073 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0915d6-9e74-4495-a517-0eaf62c5ef18" path="/var/lib/kubelet/pods/ca0915d6-9e74-4495-a517-0eaf62c5ef18/volumes" Apr 16 16:48:16.797092 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:16.797066 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:48:16.800708 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:16.800674 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:48:16.801079 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:16.801060 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log" Apr 16 16:48:16.804765 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:16.804749 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log" Apr 16 16:48:17.342373 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.342333 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 16:48:17.342729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.342717 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca0915d6-9e74-4495-a517-0eaf62c5ef18" containerName="manager" Apr 16 16:48:17.342787 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.342730 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0915d6-9e74-4495-a517-0eaf62c5ef18" containerName="manager" Apr 16 16:48:17.342822 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.342802 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca0915d6-9e74-4495-a517-0eaf62c5ef18" containerName="manager" Apr 16 16:48:17.346062 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.346036 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.350335 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.350314 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:48:17.350430 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.350314 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 16:48:17.350725 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.350706 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:48:17.351053 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.351038 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-7p4cb\"" Apr 16 16:48:17.367678 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.365209 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 16:48:17.412944 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.412906 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc"] Apr 16 16:48:17.417547 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.417520 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.420450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.420425 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-q64j6\"" Apr 16 16:48:17.429818 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.429789 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc"] Apr 16 16:48:17.470740 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.470705 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fffd00d2-3be9-4fde-b268-871b48f1bcf1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.470912 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.470756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.470912 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.470778 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.470912 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.470819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.470912 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.470871 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.471057 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.470952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbknh\" (UniqueName: \"kubernetes.io/projected/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kube-api-access-kbknh\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.571925 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.571880 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c18cab14-9b62-4b5f-bc05-be66c3a899df-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.571925 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.571933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fffd00d2-3be9-4fde-b268-871b48f1bcf1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572191 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.571985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.572191 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572191 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572063 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572191 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:48:17.572145 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.572191 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572178 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572444 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572200 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572444 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8n2c\" (UniqueName: \"kubernetes.io/projected/c18cab14-9b62-4b5f-bc05-be66c3a899df-kube-api-access-v8n2c\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.572444 ip-10-0-138-125 
kubenswrapper[2571]: I0416 16:48:17.572403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbknh\" (UniqueName: \"kubernetes.io/projected/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kube-api-access-kbknh\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572596 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572454 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572596 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572494 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.572596 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.572522 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.574906 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.574879 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.575030 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.574954 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fffd00d2-3be9-4fde-b268-871b48f1bcf1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.582512 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.582481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbknh\" (UniqueName: \"kubernetes.io/projected/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kube-api-access-kbknh\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.656869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.656839 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 16:48:17.673916 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.673882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.674043 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.673937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.674043 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.673970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8n2c\" (UniqueName: \"kubernetes.io/projected/c18cab14-9b62-4b5f-bc05-be66c3a899df-kube-api-access-v8n2c\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.674043 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.674025 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c18cab14-9b62-4b5f-bc05-be66c3a899df-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.674379 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.674353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.674446 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.674371 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.676606 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.676581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c18cab14-9b62-4b5f-bc05-be66c3a899df-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.685642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.685610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8n2c\" (UniqueName: \"kubernetes.io/projected/c18cab14-9b62-4b5f-bc05-be66c3a899df-kube-api-access-v8n2c\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.729353 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.729315 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:17.803390 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.803355 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 16:48:17.805773 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:48:17.805745 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfffd00d2_3be9_4fde_b268_871b48f1bcf1.slice/crio-fcbe2b12815d30f99729475fefda5acf268da8333a750ef5ca62526337516f38 WatchSource:0}: Error finding container fcbe2b12815d30f99729475fefda5acf268da8333a750ef5ca62526337516f38: Status 404 returned error can't find the container with id fcbe2b12815d30f99729475fefda5acf268da8333a750ef5ca62526337516f38 Apr 16 16:48:17.808233 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.808150 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:48:17.874240 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:17.874215 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc"] Apr 16 16:48:17.874917 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:48:17.874888 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18cab14_9b62_4b5f_bc05_be66c3a899df.slice/crio-0b23dbf18c4667de9c4eaf63e2762f1d6024c726171abb7fb50a9a430af8221f WatchSource:0}: Error finding container 0b23dbf18c4667de9c4eaf63e2762f1d6024c726171abb7fb50a9a430af8221f: Status 404 returned error can't find the 
container with id 0b23dbf18c4667de9c4eaf63e2762f1d6024c726171abb7fb50a9a430af8221f Apr 16 16:48:18.648427 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:18.648388 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"fffd00d2-3be9-4fde-b268-871b48f1bcf1","Type":"ContainerStarted","Data":"bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594"} Apr 16 16:48:18.648427 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:18.648437 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"fffd00d2-3be9-4fde-b268-871b48f1bcf1","Type":"ContainerStarted","Data":"fcbe2b12815d30f99729475fefda5acf268da8333a750ef5ca62526337516f38"} Apr 16 16:48:18.649806 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:18.649778 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerStarted","Data":"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3"} Apr 16 16:48:18.649806 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:18.649811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerStarted","Data":"0b23dbf18c4667de9c4eaf63e2762f1d6024c726171abb7fb50a9a430af8221f"} Apr 16 16:48:19.655110 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:19.655073 2571 generic.go:358] "Generic (PLEG): container finished" podID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerID="bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3" exitCode=0 Apr 16 16:48:19.655617 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:19.655161 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerDied","Data":"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3"} Apr 16 16:48:20.662345 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:20.662302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerStarted","Data":"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22"} Apr 16 16:48:20.662345 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:20.662351 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerStarted","Data":"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4"} Apr 16 16:48:20.662968 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:20.662402 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:20.684521 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:20.684469 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" podStartSLOduration=3.68445135 podStartE2EDuration="3.68445135s" podCreationTimestamp="2026-04-16 16:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:20.682248222 +0000 UTC m=+1504.517101126" watchObservedRunningTime="2026-04-16 16:48:20.68445135 +0000 UTC m=+1504.519304298" Apr 16 16:48:22.672362 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:22.672319 2571 generic.go:358] "Generic (PLEG): 
container finished" podID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerID="bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594" exitCode=0 Apr 16 16:48:22.672362 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:22.672364 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"fffd00d2-3be9-4fde-b268-871b48f1bcf1","Type":"ContainerDied","Data":"bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594"} Apr 16 16:48:27.730471 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:27.730436 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:27.730965 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:27.730497 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:27.733714 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:27.733663 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:28.711369 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:28.711337 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:49.718158 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:49.718103 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:48:49.809770 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:49.809726 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" 
event={"ID":"fffd00d2-3be9-4fde-b268-871b48f1bcf1","Type":"ContainerStarted","Data":"7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7"} Apr 16 16:48:49.835868 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:48:49.835805 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.1429905 podStartE2EDuration="32.835789644s" podCreationTimestamp="2026-04-16 16:48:17 +0000 UTC" firstStartedPulling="2026-04-16 16:48:22.673616383 +0000 UTC m=+1506.508469265" lastFinishedPulling="2026-04-16 16:48:49.366415522 +0000 UTC m=+1533.201268409" observedRunningTime="2026-04-16 16:48:49.831163756 +0000 UTC m=+1533.666016656" watchObservedRunningTime="2026-04-16 16:48:49.835789644 +0000 UTC m=+1533.670642548" Apr 16 16:51:13.987014 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:13.986979 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc"] Apr 16 16:51:13.987576 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:13.987295 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="main" containerID="cri-o://eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4" gracePeriod=30 Apr 16 16:51:13.987576 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:13.987339 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="tokenizer" containerID="cri-o://cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22" gracePeriod=30 Apr 16 16:51:14.345287 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.345261 2571 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:51:14.353750 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.353726 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-kserve-provision-location\") pod \"c18cab14-9b62-4b5f-bc05-be66c3a899df\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " Apr 16 16:51:14.353841 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.353767 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c18cab14-9b62-4b5f-bc05-be66c3a899df-tls-certs\") pod \"c18cab14-9b62-4b5f-bc05-be66c3a899df\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " Apr 16 16:51:14.353841 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.353828 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-tokenizer-uds\") pod \"c18cab14-9b62-4b5f-bc05-be66c3a899df\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " Apr 16 16:51:14.353922 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.353865 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8n2c\" (UniqueName: \"kubernetes.io/projected/c18cab14-9b62-4b5f-bc05-be66c3a899df-kube-api-access-v8n2c\") pod \"c18cab14-9b62-4b5f-bc05-be66c3a899df\" (UID: \"c18cab14-9b62-4b5f-bc05-be66c3a899df\") " Apr 16 16:51:14.354169 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.354140 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c18cab14-9b62-4b5f-bc05-be66c3a899df" (UID: 
"c18cab14-9b62-4b5f-bc05-be66c3a899df"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:51:14.354464 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.354431 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c18cab14-9b62-4b5f-bc05-be66c3a899df" (UID: "c18cab14-9b62-4b5f-bc05-be66c3a899df"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:51:14.355939 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.355909 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c18cab14-9b62-4b5f-bc05-be66c3a899df-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c18cab14-9b62-4b5f-bc05-be66c3a899df" (UID: "c18cab14-9b62-4b5f-bc05-be66c3a899df"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:14.356034 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.355976 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c18cab14-9b62-4b5f-bc05-be66c3a899df-kube-api-access-v8n2c" (OuterVolumeSpecName: "kube-api-access-v8n2c") pod "c18cab14-9b62-4b5f-bc05-be66c3a899df" (UID: "c18cab14-9b62-4b5f-bc05-be66c3a899df"). InnerVolumeSpecName "kube-api-access-v8n2c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:51:14.412963 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.412930 2571 generic.go:358] "Generic (PLEG): container finished" podID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerID="cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22" exitCode=0 Apr 16 16:51:14.412963 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.412955 2571 generic.go:358] "Generic (PLEG): container finished" podID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerID="eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4" exitCode=0 Apr 16 16:51:14.413225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.412999 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerDied","Data":"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22"} Apr 16 16:51:14.413225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.413005 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" Apr 16 16:51:14.413225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.413042 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerDied","Data":"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4"} Apr 16 16:51:14.413225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.413053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc" event={"ID":"c18cab14-9b62-4b5f-bc05-be66c3a899df","Type":"ContainerDied","Data":"0b23dbf18c4667de9c4eaf63e2762f1d6024c726171abb7fb50a9a430af8221f"} Apr 16 16:51:14.413225 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.413067 2571 scope.go:117] "RemoveContainer" containerID="cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22" Apr 16 16:51:14.422834 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.422811 2571 scope.go:117] "RemoveContainer" containerID="eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4" Apr 16 16:51:14.431081 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.431062 2571 scope.go:117] "RemoveContainer" containerID="bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3" Apr 16 16:51:14.437312 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.437286 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc"] Apr 16 16:51:14.441150 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.441108 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-scheffvmc"] Apr 16 16:51:14.441445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.441431 2571 
scope.go:117] "RemoveContainer" containerID="cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22" Apr 16 16:51:14.441751 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:14.441721 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22\": container with ID starting with cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22 not found: ID does not exist" containerID="cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22" Apr 16 16:51:14.441837 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.441759 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22"} err="failed to get container status \"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22\": rpc error: code = NotFound desc = could not find container \"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22\": container with ID starting with cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22 not found: ID does not exist" Apr 16 16:51:14.441837 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.441784 2571 scope.go:117] "RemoveContainer" containerID="eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4" Apr 16 16:51:14.442043 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:14.442024 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4\": container with ID starting with eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4 not found: ID does not exist" containerID="eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4" Apr 16 16:51:14.442089 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442049 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4"} err="failed to get container status \"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4\": rpc error: code = NotFound desc = could not find container \"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4\": container with ID starting with eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4 not found: ID does not exist" Apr 16 16:51:14.442089 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442064 2571 scope.go:117] "RemoveContainer" containerID="bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3" Apr 16 16:51:14.442314 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:14.442297 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3\": container with ID starting with bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3 not found: ID does not exist" containerID="bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3" Apr 16 16:51:14.442370 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442317 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3"} err="failed to get container status \"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3\": rpc error: code = NotFound desc = could not find container \"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3\": container with ID starting with bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3 not found: ID does not exist" Apr 16 16:51:14.442370 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442330 2571 scope.go:117] "RemoveContainer" 
containerID="cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22" Apr 16 16:51:14.442548 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442528 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22"} err="failed to get container status \"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22\": rpc error: code = NotFound desc = could not find container \"cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22\": container with ID starting with cb9a38c90b382c9851dc02dfee07d524c996b56d839ba91558c29a880cd32f22 not found: ID does not exist" Apr 16 16:51:14.442604 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442550 2571 scope.go:117] "RemoveContainer" containerID="eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4" Apr 16 16:51:14.442741 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442726 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4"} err="failed to get container status \"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4\": rpc error: code = NotFound desc = could not find container \"eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4\": container with ID starting with eea425a7873fe3e842f6b7e4b157f89e9f6116e17137539ab9dbdfdd0070f6d4 not found: ID does not exist" Apr 16 16:51:14.442784 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442743 2571 scope.go:117] "RemoveContainer" containerID="bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3" Apr 16 16:51:14.442913 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.442897 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3"} err="failed to get container status 
\"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3\": rpc error: code = NotFound desc = could not find container \"bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3\": container with ID starting with bceb1c2c5f38ebee0c5e144f6ca5d9a54eeed52ab8ce8b5b31ab0154c31a25b3 not found: ID does not exist"
Apr 16 16:51:14.454955 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.454929 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-tokenizer-uds\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:14.454955 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.454956 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v8n2c\" (UniqueName: \"kubernetes.io/projected/c18cab14-9b62-4b5f-bc05-be66c3a899df-kube-api-access-v8n2c\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:14.455087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.454966 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c18cab14-9b62-4b5f-bc05-be66c3a899df-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:14.455087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.454975 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c18cab14-9b62-4b5f-bc05-be66c3a899df-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:14.756895 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:14.756861 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" path="/var/lib/kubelet/pods/c18cab14-9b62-4b5f-bc05-be66c3a899df/volumes"
Apr 16 16:51:15.790322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:15.790283 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 16:51:15.790764 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:15.790551 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerName="main" containerID="cri-o://7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7" gracePeriod=30
Apr 16 16:51:16.569270 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.569245 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:51:16.673780 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.673752 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-dshm\") pod \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") "
Apr 16 16:51:16.673942 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.673804 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fffd00d2-3be9-4fde-b268-871b48f1bcf1-tls-certs\") pod \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") "
Apr 16 16:51:16.673942 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.673851 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-home\") pod \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") "
Apr 16 16:51:16.673942 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.673878 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbknh\" (UniqueName: \"kubernetes.io/projected/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kube-api-access-kbknh\") pod \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") "
Apr 16 16:51:16.673942 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.673907 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kserve-provision-location\") pod \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") "
Apr 16 16:51:16.674177 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.673957 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-model-cache\") pod \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\" (UID: \"fffd00d2-3be9-4fde-b268-871b48f1bcf1\") "
Apr 16 16:51:16.674337 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.674306 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-home" (OuterVolumeSpecName: "home") pod "fffd00d2-3be9-4fde-b268-871b48f1bcf1" (UID: "fffd00d2-3be9-4fde-b268-871b48f1bcf1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:16.674467 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.674436 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-model-cache" (OuterVolumeSpecName: "model-cache") pod "fffd00d2-3be9-4fde-b268-871b48f1bcf1" (UID: "fffd00d2-3be9-4fde-b268-871b48f1bcf1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:16.676087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.676049 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-dshm" (OuterVolumeSpecName: "dshm") pod "fffd00d2-3be9-4fde-b268-871b48f1bcf1" (UID: "fffd00d2-3be9-4fde-b268-871b48f1bcf1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:16.676087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.676060 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kube-api-access-kbknh" (OuterVolumeSpecName: "kube-api-access-kbknh") pod "fffd00d2-3be9-4fde-b268-871b48f1bcf1" (UID: "fffd00d2-3be9-4fde-b268-871b48f1bcf1"). InnerVolumeSpecName "kube-api-access-kbknh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:51:16.676261 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.676141 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fffd00d2-3be9-4fde-b268-871b48f1bcf1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fffd00d2-3be9-4fde-b268-871b48f1bcf1" (UID: "fffd00d2-3be9-4fde-b268-871b48f1bcf1"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:51:16.729349 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.729294 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fffd00d2-3be9-4fde-b268-871b48f1bcf1" (UID: "fffd00d2-3be9-4fde-b268-871b48f1bcf1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:16.775001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.774972 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-home\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:16.775001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.775000 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kbknh\" (UniqueName: \"kubernetes.io/projected/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kube-api-access-kbknh\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:16.775214 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.775015 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:16.775214 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.775029 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-model-cache\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:16.775214 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.775041 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fffd00d2-3be9-4fde-b268-871b48f1bcf1-dshm\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:16.775214 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:16.775149 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fffd00d2-3be9-4fde-b268-871b48f1bcf1-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:17.431962 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.431924 2571 generic.go:358] "Generic (PLEG): container finished" podID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerID="7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7" exitCode=0
Apr 16 16:51:17.432391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.431988 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 16 16:51:17.432391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.432015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"fffd00d2-3be9-4fde-b268-871b48f1bcf1","Type":"ContainerDied","Data":"7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7"}
Apr 16 16:51:17.432391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.432053 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"fffd00d2-3be9-4fde-b268-871b48f1bcf1","Type":"ContainerDied","Data":"fcbe2b12815d30f99729475fefda5acf268da8333a750ef5ca62526337516f38"}
Apr 16 16:51:17.432391 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.432071 2571 scope.go:117] "RemoveContainer" containerID="7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7"
Apr 16 16:51:17.450015 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.449980 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 16:51:17.451639 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.451617 2571 scope.go:117] "RemoveContainer" containerID="bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594"
Apr 16 16:51:17.454138 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.454102 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 16 16:51:17.517751 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.517724 2571 scope.go:117] "RemoveContainer" containerID="7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7"
Apr 16 16:51:17.518074 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:17.518055 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7\": container with ID starting with 7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7 not found: ID does not exist" containerID="7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7"
Apr 16 16:51:17.518190 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.518083 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7"} err="failed to get container status \"7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7\": rpc error: code = NotFound desc = could not find container \"7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7\": container with ID starting with 7a3ba4891f3477c6e4e984cf6a4ee4553a081b46d9db26f3e8c5ad94a6a9b2f7 not found: ID does not exist"
Apr 16 16:51:17.518190 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.518104 2571 scope.go:117] "RemoveContainer" containerID="bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594"
Apr 16 16:51:17.518409 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:17.518382 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594\": container with ID starting with bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594 not found: ID does not exist" containerID="bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594"
Apr 16 16:51:17.518455 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:17.518416 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594"} err="failed to get container status \"bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594\": rpc error: code = NotFound desc = could not find container \"bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594\": container with ID starting with bfbdc02c4390cf24265d2557fdde3dd87059f72594685e36b9b3b304de05d594 not found: ID does not exist"
Apr 16 16:51:18.755581 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:18.755546 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" path="/var/lib/kubelet/pods/fffd00d2-3be9-4fde-b268-871b48f1bcf1/volumes"
Apr 16 16:51:24.043867 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.043826 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"]
Apr 16 16:51:24.044267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044224 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerName="storage-initializer"
Apr 16 16:51:24.044267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044236 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerName="storage-initializer"
Apr 16 16:51:24.044267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044247 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="main"
Apr 16 16:51:24.044267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044253 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="main"
Apr 16 16:51:24.044267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044262 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="storage-initializer"
Apr 16 16:51:24.044267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044268 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="storage-initializer"
Apr 16 16:51:24.044516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044282 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="tokenizer"
Apr 16 16:51:24.044516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044287 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="tokenizer"
Apr 16 16:51:24.044516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044293 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerName="main"
Apr 16 16:51:24.044516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044298 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerName="main"
Apr 16 16:51:24.044516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044361 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="main"
Apr 16 16:51:24.044516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044372 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="c18cab14-9b62-4b5f-bc05-be66c3a899df" containerName="tokenizer"
Apr 16 16:51:24.044516 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.044380 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fffd00d2-3be9-4fde-b268-871b48f1bcf1" containerName="main"
Apr 16 16:51:24.049658 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.049637 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.052198 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.052171 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:51:24.052198 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.052187 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 16 16:51:24.052198 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.052191 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 16 16:51:24.053230 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.053164 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-xlz7x\""
Apr 16 16:51:24.056540 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.056517 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"]
Apr 16 16:51:24.142712 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.142679 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11a67968-d706-46ce-bbed-de1e92a54309-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.142868 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.142721 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkfd\" (UniqueName: \"kubernetes.io/projected/11a67968-d706-46ce-bbed-de1e92a54309-kube-api-access-jhkfd\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.142868 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.142775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.142962 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.142857 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.244250 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.244214 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.244422 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.244295 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11a67968-d706-46ce-bbed-de1e92a54309-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.244422 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.244330 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkfd\" (UniqueName: \"kubernetes.io/projected/11a67968-d706-46ce-bbed-de1e92a54309-kube-api-access-jhkfd\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.244422 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.244383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.244719 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.244696 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.244757 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.244712 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.246854 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.246830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11a67968-d706-46ce-bbed-de1e92a54309-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.252642 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.252612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhkfd\" (UniqueName: \"kubernetes.io/projected/11a67968-d706-46ce-bbed-de1e92a54309-kube-api-access-jhkfd\") pod \"scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.359847 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.359745 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:24.504498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:24.504456 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"]
Apr 16 16:51:24.509804 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:51:24.509772 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11a67968_d706_46ce_bbed_de1e92a54309.slice/crio-9c0085b1773db126e184728879ac4743788c9bd93003ed5b704628fee9140024 WatchSource:0}: Error finding container 9c0085b1773db126e184728879ac4743788c9bd93003ed5b704628fee9140024: Status 404 returned error can't find the container with id 9c0085b1773db126e184728879ac4743788c9bd93003ed5b704628fee9140024
Apr 16 16:51:25.465635 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:25.465602 2571 generic.go:358] "Generic (PLEG): container finished" podID="11a67968-d706-46ce-bbed-de1e92a54309" containerID="e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84" exitCode=0
Apr 16 16:51:25.466042 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:25.465702 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" event={"ID":"11a67968-d706-46ce-bbed-de1e92a54309","Type":"ContainerDied","Data":"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84"}
Apr 16 16:51:25.466042 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:25.465746 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" event={"ID":"11a67968-d706-46ce-bbed-de1e92a54309","Type":"ContainerStarted","Data":"9c0085b1773db126e184728879ac4743788c9bd93003ed5b704628fee9140024"}
Apr 16 16:51:26.471933 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:26.471888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" event={"ID":"11a67968-d706-46ce-bbed-de1e92a54309","Type":"ContainerStarted","Data":"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166"}
Apr 16 16:51:26.471933 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:26.471933 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" event={"ID":"11a67968-d706-46ce-bbed-de1e92a54309","Type":"ContainerStarted","Data":"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574"}
Apr 16 16:51:26.472472 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:26.471948 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:26.496011 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:26.495960 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" podStartSLOduration=2.495946855 podStartE2EDuration="2.495946855s" podCreationTimestamp="2026-04-16 16:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:51:26.49251512 +0000 UTC m=+1690.327368025" watchObservedRunningTime="2026-04-16 16:51:26.495946855 +0000 UTC m=+1690.330799759"
Apr 16 16:51:34.360303 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:34.360261 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:34.360303 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:34.360302 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:34.362976 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:34.362947 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:34.504508 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:34.504473 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:55.509331 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:55.509255 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:56.652765 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:56.652716 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"]
Apr 16 16:51:56.653419 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:56.653372 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="main" containerID="cri-o://254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574" gracePeriod=30
Apr 16 16:51:56.653419 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:56.653407 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="tokenizer" containerID="cri-o://3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166" gracePeriod=30
Apr 16 16:51:57.003664 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.003636 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"
Apr 16 16:51:57.155319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.155288 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11a67968-d706-46ce-bbed-de1e92a54309-tls-certs\") pod \"11a67968-d706-46ce-bbed-de1e92a54309\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") "
Apr 16 16:51:57.155502 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.155355 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-kserve-provision-location\") pod \"11a67968-d706-46ce-bbed-de1e92a54309\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") "
Apr 16 16:51:57.155502 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.155396 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhkfd\" (UniqueName: \"kubernetes.io/projected/11a67968-d706-46ce-bbed-de1e92a54309-kube-api-access-jhkfd\") pod \"11a67968-d706-46ce-bbed-de1e92a54309\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") "
Apr 16 16:51:57.155502 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.155466 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-tokenizer-uds\") pod \"11a67968-d706-46ce-bbed-de1e92a54309\" (UID: \"11a67968-d706-46ce-bbed-de1e92a54309\") "
Apr 16 16:51:57.155805 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.155778 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "11a67968-d706-46ce-bbed-de1e92a54309" (UID: "11a67968-d706-46ce-bbed-de1e92a54309"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:57.156155 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.156100 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "11a67968-d706-46ce-bbed-de1e92a54309" (UID: "11a67968-d706-46ce-bbed-de1e92a54309"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:57.157423 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.157399 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a67968-d706-46ce-bbed-de1e92a54309-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "11a67968-d706-46ce-bbed-de1e92a54309" (UID: "11a67968-d706-46ce-bbed-de1e92a54309"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:51:57.157556 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.157533 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a67968-d706-46ce-bbed-de1e92a54309-kube-api-access-jhkfd" (OuterVolumeSpecName: "kube-api-access-jhkfd") pod "11a67968-d706-46ce-bbed-de1e92a54309" (UID: "11a67968-d706-46ce-bbed-de1e92a54309"). InnerVolumeSpecName "kube-api-access-jhkfd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:51:57.256911 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.256825 2571 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-tokenizer-uds\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.256911 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.256856 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11a67968-d706-46ce-bbed-de1e92a54309-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.256911 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.256868 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11a67968-d706-46ce-bbed-de1e92a54309-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.256911 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.256880 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhkfd\" (UniqueName: \"kubernetes.io/projected/11a67968-d706-46ce-bbed-de1e92a54309-kube-api-access-jhkfd\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\""
Apr 16 16:51:57.306411 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306378 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7"]
Apr 16 16:51:57.306769 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306754 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="main"
Apr 16 16:51:57.306769 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306769 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="main"
Apr 16 16:51:57.306901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306781 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="storage-initializer"
Apr 16 16:51:57.306901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306788 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="storage-initializer"
Apr 16 16:51:57.306901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306805 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="tokenizer"
Apr 16 16:51:57.306901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306810 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="tokenizer"
Apr 16 16:51:57.306901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306868 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="main"
Apr 16 16:51:57.306901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.306877 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="11a67968-d706-46ce-bbed-de1e92a54309" containerName="tokenizer"
Apr 16 16:51:57.320538 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.320511 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.323273 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.323248 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-l6cr9\"" Apr 16 16:51:57.323527 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.323512 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 16:51:57.330515 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.330484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7"] Apr 16 16:51:57.458476 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/71934c38-1068-4b27-8623-52057ef5f6b8-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458666 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458666 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458666 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458606 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458666 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458666 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458649 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b69vv\" (UniqueName: \"kubernetes.io/projected/71934c38-1068-4b27-8623-52057ef5f6b8-kube-api-access-b69vv\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458705 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/71934c38-1068-4b27-8623-52057ef5f6b8-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.458869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.458779 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71934c38-1068-4b27-8623-52057ef5f6b8-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.559972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.559881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71934c38-1068-4b27-8623-52057ef5f6b8-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.559972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.559950 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/71934c38-1068-4b27-8623-52057ef5f6b8-istio-podinfo\") pod 
\"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560227 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.559993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560227 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560016 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560227 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560159 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560227 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560426 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560233 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b69vv\" (UniqueName: \"kubernetes.io/projected/71934c38-1068-4b27-8623-52057ef5f6b8-kube-api-access-b69vv\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560426 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560270 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560426 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560306 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/71934c38-1068-4b27-8623-52057ef5f6b8-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560593 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560463 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560593 ip-10-0-138-125 kubenswrapper[2571]: 
I0416 16:51:57.560511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560593 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560529 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560729 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560587 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.560949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.560930 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/71934c38-1068-4b27-8623-52057ef5f6b8-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.562439 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.562414 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/71934c38-1068-4b27-8623-52057ef5f6b8-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.562638 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.562618 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/71934c38-1068-4b27-8623-52057ef5f6b8-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.570584 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.570556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/71934c38-1068-4b27-8623-52057ef5f6b8-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.571220 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.571200 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b69vv\" (UniqueName: \"kubernetes.io/projected/71934c38-1068-4b27-8623-52057ef5f6b8-kube-api-access-b69vv\") pod \"router-gateway-2-openshift-default-6866b85949-4p7m7\" (UID: \"71934c38-1068-4b27-8623-52057ef5f6b8\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.600604 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.600548 2571 generic.go:358] "Generic (PLEG): container finished" podID="11a67968-d706-46ce-bbed-de1e92a54309" containerID="3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166" exitCode=0 Apr 16 16:51:57.600604 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.600598 
2571 generic.go:358] "Generic (PLEG): container finished" podID="11a67968-d706-46ce-bbed-de1e92a54309" containerID="254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574" exitCode=0 Apr 16 16:51:57.600853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.600737 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" event={"ID":"11a67968-d706-46ce-bbed-de1e92a54309","Type":"ContainerDied","Data":"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166"} Apr 16 16:51:57.600853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.600768 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" event={"ID":"11a67968-d706-46ce-bbed-de1e92a54309","Type":"ContainerDied","Data":"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574"} Apr 16 16:51:57.600853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.600783 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" event={"ID":"11a67968-d706-46ce-bbed-de1e92a54309","Type":"ContainerDied","Data":"9c0085b1773db126e184728879ac4743788c9bd93003ed5b704628fee9140024"} Apr 16 16:51:57.600853 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.600814 2571 scope.go:117] "RemoveContainer" containerID="3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166" Apr 16 16:51:57.601094 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.601075 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7" Apr 16 16:51:57.613939 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.613917 2571 scope.go:117] "RemoveContainer" containerID="254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574" Apr 16 16:51:57.625327 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.625304 2571 scope.go:117] "RemoveContainer" containerID="e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84" Apr 16 16:51:57.629041 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.629013 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"] Apr 16 16:51:57.632230 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.632206 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:51:57.633766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.633744 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-7599bzdpq7"] Apr 16 16:51:57.634552 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.634529 2571 scope.go:117] "RemoveContainer" containerID="3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166" Apr 16 16:51:57.634850 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:57.634821 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166\": container with ID starting with 3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166 not found: ID does not exist" containerID="3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166" Apr 16 16:51:57.634948 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.634858 2571 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166"} err="failed to get container status \"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166\": rpc error: code = NotFound desc = could not find container \"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166\": container with ID starting with 3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166 not found: ID does not exist" Apr 16 16:51:57.634948 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.634881 2571 scope.go:117] "RemoveContainer" containerID="254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574" Apr 16 16:51:57.635182 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:57.635159 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574\": container with ID starting with 254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574 not found: ID does not exist" containerID="254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574" Apr 16 16:51:57.635237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635192 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574"} err="failed to get container status \"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574\": rpc error: code = NotFound desc = could not find container \"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574\": container with ID starting with 254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574 not found: ID does not exist" Apr 16 16:51:57.635237 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635214 2571 scope.go:117] "RemoveContainer" containerID="e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84" Apr 16 
16:51:57.635480 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:51:57.635457 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84\": container with ID starting with e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84 not found: ID does not exist" containerID="e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84" Apr 16 16:51:57.635558 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635485 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84"} err="failed to get container status \"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84\": rpc error: code = NotFound desc = could not find container \"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84\": container with ID starting with e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84 not found: ID does not exist" Apr 16 16:51:57.635558 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635500 2571 scope.go:117] "RemoveContainer" containerID="3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166" Apr 16 16:51:57.635725 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635709 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166"} err="failed to get container status \"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166\": rpc error: code = NotFound desc = could not find container \"3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166\": container with ID starting with 3de0610dada105961672da0eb335f8bb15c6af3e4783a276a3a58658d26e8166 not found: ID does not exist" Apr 16 16:51:57.635767 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635726 2571 scope.go:117] 
"RemoveContainer" containerID="254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574" Apr 16 16:51:57.635949 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635934 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574"} err="failed to get container status \"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574\": rpc error: code = NotFound desc = could not find container \"254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574\": container with ID starting with 254d3eedc65c94efda051e73b8e4b29666bf77c8f1d540432f1e10a6d2494574 not found: ID does not exist" Apr 16 16:51:57.635996 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.635950 2571 scope.go:117] "RemoveContainer" containerID="e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84" Apr 16 16:51:57.636155 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.636132 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84"} err="failed to get container status \"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84\": rpc error: code = NotFound desc = could not find container \"e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84\": container with ID starting with e6ec013b811ad401652f2fb1137e36721571c02e8cda3cad26862b6107374d84 not found: ID does not exist" Apr 16 16:51:57.773105 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:57.773074 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7"] Apr 16 16:51:57.774820 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:51:57.774794 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71934c38_1068_4b27_8623_52057ef5f6b8.slice/crio-635b3c9cdc9bd5fe6ec7b83b2da0b61875217bca978f08a961b60433b03ee0c8 WatchSource:0}: Error finding container 635b3c9cdc9bd5fe6ec7b83b2da0b61875217bca978f08a961b60433b03ee0c8: Status 404 returned error can't find the container with id 635b3c9cdc9bd5fe6ec7b83b2da0b61875217bca978f08a961b60433b03ee0c8 Apr 16 16:51:58.607605 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:58.607564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" event={"ID":"71934c38-1068-4b27-8623-52057ef5f6b8","Type":"ContainerStarted","Data":"635b3c9cdc9bd5fe6ec7b83b2da0b61875217bca978f08a961b60433b03ee0c8"} Apr 16 16:51:58.758677 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:51:58.758640 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a67968-d706-46ce-bbed-de1e92a54309" path="/var/lib/kubelet/pods/11a67968-d706-46ce-bbed-de1e92a54309/volumes" Apr 16 16:52:00.237269 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:00.237228 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 16:52:00.237660 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:00.237320 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 16:52:00.237660 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:00.237372 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 16:52:00.618573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:00.618480 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" event={"ID":"71934c38-1068-4b27-8623-52057ef5f6b8","Type":"ContainerStarted","Data":"5c316e6c375072bc1eb5184af0e04e997b5ea2749dd4bf9ea517e8e604ea4526"} Apr 16 16:52:00.632708 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:00.632677 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" Apr 16 16:52:00.643614 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:00.643541 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" podStartSLOduration=1.183335583 podStartE2EDuration="3.643524436s" podCreationTimestamp="2026-04-16 16:51:57 +0000 UTC" firstStartedPulling="2026-04-16 16:51:57.77671881 +0000 UTC m=+1721.611571694" lastFinishedPulling="2026-04-16 16:52:00.236907653 +0000 UTC m=+1724.071760547" observedRunningTime="2026-04-16 16:52:00.640011631 +0000 UTC m=+1724.474864575" watchObservedRunningTime="2026-04-16 16:52:00.643524436 +0000 UTC m=+1724.478377340" Apr 16 16:52:01.634271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:01.634230 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" podUID="71934c38-1068-4b27-8623-52057ef5f6b8" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.60:15021/healthz/ready\": dial tcp 10.134.0.60:15021: connect: connection refused" Apr 16 16:52:02.633330 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:02.633293 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7" podUID="71934c38-1068-4b27-8623-52057ef5f6b8" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.60:15021/healthz/ready\": dial tcp 10.134.0.60:15021: connect: connection refused" Apr 16 
16:52:03.636721 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:03.636689 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7"
Apr 16 16:52:03.637138 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:03.637077 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7"
Apr 16 16:52:03.637740 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:03.637724 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-4p7m7"
Apr 16 16:52:13.587947 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.587913 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"]
Apr 16 16:52:13.592441 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.592425 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.594990 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.594961 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 16 16:52:13.595159 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.595005 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-n8zq8\""
Apr 16 16:52:13.605519 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.605484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"]
Apr 16 16:52:13.672970 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.672939 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"]
Apr 16 16:52:13.677260 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.677238 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.693173 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.693110 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-4ddlh\""
Apr 16 16:52:13.704425 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.704396 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"]
Apr 16 16:52:13.721822 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.721792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-dshm\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.721822 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.721822 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-model-cache\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.722008 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.721844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7m27\" (UniqueName: \"kubernetes.io/projected/d95ee401-7225-4eeb-99c6-b6879e724493-kube-api-access-b7m27\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.722008 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.721915 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b30d8ef-2b9e-4771-8954-1a186e65e310-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.722008 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.721943 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hprqs\" (UniqueName: \"kubernetes.io/projected/1b30d8ef-2b9e-4771-8954-1a186e65e310-kube-api-access-hprqs\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.722008 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.721968 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-home\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.722172 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.722027 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.722172 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.722061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d95ee401-7225-4eeb-99c6-b6879e724493-tls-certs\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.722172 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.722099 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.722172 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.722130 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.722314 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.722180 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-home\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.722314 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.722210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.823678 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.823636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.823882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.823719 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-dshm\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.823882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.823736 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-model-cache\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.823882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.823756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7m27\" (UniqueName: \"kubernetes.io/projected/d95ee401-7225-4eeb-99c6-b6879e724493-kube-api-access-b7m27\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.823882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.823790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b30d8ef-2b9e-4771-8954-1a186e65e310-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.823882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.823807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hprqs\" (UniqueName: \"kubernetes.io/projected/1b30d8ef-2b9e-4771-8954-1a186e65e310-kube-api-access-hprqs\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.823882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.823837 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-home\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.824319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.824319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824180 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d95ee401-7225-4eeb-99c6-b6879e724493-tls-certs\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.824319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824260 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.824319 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824310 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.824551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-model-cache\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.824551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-home\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.824551 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-home\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.824711 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-home\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.824766 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.824821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824771 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.824903 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.824878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.826308 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.826285 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-dshm\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.826437 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.826295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.826574 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.826556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b30d8ef-2b9e-4771-8954-1a186e65e310-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.826796 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.826777 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d95ee401-7225-4eeb-99c6-b6879e724493-tls-certs\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.832846 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.832818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hprqs\" (UniqueName: \"kubernetes.io/projected/1b30d8ef-2b9e-4771-8954-1a186e65e310-kube-api-access-hprqs\") pod \"router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:13.833806 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.833778 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7m27\" (UniqueName: \"kubernetes.io/projected/d95ee401-7225-4eeb-99c6-b6879e724493-kube-api-access-b7m27\") pod \"router-with-refs-pd-test-kserve-5557b45666-skg9c\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.902534 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.902435 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:13.987639 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:13.987352 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:14.053009 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:14.052966 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"]
Apr 16 16:52:14.075458 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:52:14.075425 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd95ee401_7225_4eeb_99c6_b6879e724493.slice/crio-d4879f93cc09401336985b58e89b206a8d938bdd58b30a14a3467a9e5ee21295 WatchSource:0}: Error finding container d4879f93cc09401336985b58e89b206a8d938bdd58b30a14a3467a9e5ee21295: Status 404 returned error can't find the container with id d4879f93cc09401336985b58e89b206a8d938bdd58b30a14a3467a9e5ee21295
Apr 16 16:52:14.159672 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:14.159639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"]
Apr 16 16:52:14.161415 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:52:14.161385 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b30d8ef_2b9e_4771_8954_1a186e65e310.slice/crio-223b749362bc934545fee48090d24110c41931a7cbb6d89b0b9fabda56739c54 WatchSource:0}: Error finding container 223b749362bc934545fee48090d24110c41931a7cbb6d89b0b9fabda56739c54: Status 404 returned error can't find the container with id 223b749362bc934545fee48090d24110c41931a7cbb6d89b0b9fabda56739c54
Apr 16 16:52:14.681546 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:14.681469 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerStarted","Data":"d4879f93cc09401336985b58e89b206a8d938bdd58b30a14a3467a9e5ee21295"}
Apr 16 16:52:14.685855 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:14.685811 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" event={"ID":"1b30d8ef-2b9e-4771-8954-1a186e65e310","Type":"ContainerStarted","Data":"6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704"}
Apr 16 16:52:14.686041 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:14.685868 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" event={"ID":"1b30d8ef-2b9e-4771-8954-1a186e65e310","Type":"ContainerStarted","Data":"223b749362bc934545fee48090d24110c41931a7cbb6d89b0b9fabda56739c54"}
Apr 16 16:52:15.692144 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:15.692015 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerStarted","Data":"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075"}
Apr 16 16:52:15.692631 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:15.692193 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:16.700024 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:16.699976 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerStarted","Data":"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837"}
Apr 16 16:52:18.711260 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:18.711226 2571 generic.go:358] "Generic (PLEG): container finished" podID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerID="6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704" exitCode=0
Apr 16 16:52:18.711635 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:18.711295 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" event={"ID":"1b30d8ef-2b9e-4771-8954-1a186e65e310","Type":"ContainerDied","Data":"6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704"}
Apr 16 16:52:19.718171 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:19.718135 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" event={"ID":"1b30d8ef-2b9e-4771-8954-1a186e65e310","Type":"ContainerStarted","Data":"d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6"}
Apr 16 16:52:19.745367 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:19.745301 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podStartSLOduration=6.745280097 podStartE2EDuration="6.745280097s" podCreationTimestamp="2026-04-16 16:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:52:19.742482817 +0000 UTC m=+1743.577335723" watchObservedRunningTime="2026-04-16 16:52:19.745280097 +0000 UTC m=+1743.580133002"
Apr 16 16:52:20.724953 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:20.724918 2571 generic.go:358] "Generic (PLEG): container finished" podID="d95ee401-7225-4eeb-99c6-b6879e724493" containerID="5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837" exitCode=0
Apr 16 16:52:20.725497 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:20.724990 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerDied","Data":"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837"}
Apr 16 16:52:21.731957 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:21.731921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerStarted","Data":"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c"}
Apr 16 16:52:21.765748 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:21.765691 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podStartSLOduration=7.636216531 podStartE2EDuration="8.765676142s" podCreationTimestamp="2026-04-16 16:52:13 +0000 UTC" firstStartedPulling="2026-04-16 16:52:14.077650388 +0000 UTC m=+1737.912503280" lastFinishedPulling="2026-04-16 16:52:15.207109995 +0000 UTC m=+1739.041962891" observedRunningTime="2026-04-16 16:52:21.76151213 +0000 UTC m=+1745.596365034" watchObservedRunningTime="2026-04-16 16:52:21.765676142 +0000 UTC m=+1745.600529049"
Apr 16 16:52:23.903186 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:23.903147 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:23.903186 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:23.903196 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:23.904793 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:23.904759 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:52:23.988345 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:23.988306 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:23.988542 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:23.988361 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"
Apr 16 16:52:23.990063 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:23.990025 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:52:33.903397 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:33.903338 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:52:33.920903 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:33.920864 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"
Apr 16 16:52:33.988637 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:33.988600 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:52:43.903034 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:43.902981 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:52:43.988421 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:43.988374 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:52:53.903607 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:53.903554 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:52:53.988792 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:52:53.988743 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:53:03.903763 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:03.903721 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:53:03.988514 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:03.988473 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:53:13.903640 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:13.903587 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:53:13.988313 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:13.988262 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:53:16.834967 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:16.834941 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:53:16.838317 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:16.838292 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log"
Apr 16 16:53:16.838730 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:16.838711 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:53:16.842298 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:16.842277 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vm8pb_5c076e87-1778-44fa-9253-5a9e0c898f3b/ovn-acl-logging/0.log"
Apr 16 16:53:23.903309 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:23.903250 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:53:23.988768 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:23.988716 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:53:33.903909 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:33.903856 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:53:33.987927 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:33.987888 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:53:43.903457 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:43.903398 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:53:43.988279 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:43.988233 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:53:53.903679 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:53.903617 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:53:53.988417 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:53:53.988369 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:54:03.903588 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:03.903527 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:54:03.987952 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:03.987910 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:54:13.903795 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:13.903748 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:54:13.988738 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:13.988691 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused"
Apr 16 16:54:23.903731 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:23.903677 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused"
Apr 16 16:54:23.987937 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:23.987890 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310"
containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 16 16:54:33.903901 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:33.903852 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" probeResult="failure" output="Get \"https://10.134.0.61:8001/health\": dial tcp 10.134.0.61:8001: connect: connection refused" Apr 16 16:54:33.988267 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:33.988217 2571 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" probeResult="failure" output="Get \"https://10.134.0.62:8000/health\": dial tcp 10.134.0.62:8000: connect: connection refused" Apr 16 16:54:43.912622 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:43.912536 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" Apr 16 16:54:43.924485 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:43.924457 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" Apr 16 16:54:43.999009 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:43.998978 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" Apr 16 16:54:44.010029 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:44.009996 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" Apr 16 16:54:56.667625 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:56.667583 2571 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"] Apr 16 16:54:56.668230 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:56.667948 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main" containerID="cri-o://d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6" gracePeriod=30 Apr 16 16:54:56.676631 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:56.676603 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"] Apr 16 16:54:56.676981 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:54:56.676930 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main" containerID="cri-o://62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c" gracePeriod=30 Apr 16 16:55:12.060599 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:12.060566 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:12.097748 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:12.097717 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:12.105271 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:12.105235 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:12.117899 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:12.117876 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:12.136076 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:12.136048 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:12.145490 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:12.145469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:13.125886 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:13.125861 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:13.150748 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:13.150716 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:13.158126 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:13.158087 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:13.169980 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:13.169947 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:13.188965 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:13.188937 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:13.197854 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:13.197829 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:14.252726 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:14.252700 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:14.274630 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:14.274602 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:14.281865 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:14.281840 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:14.292849 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:14.292825 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:14.310810 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:14.310788 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:14.320213 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:14.320192 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:15.272439 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:15.272404 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:15.295942 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:15.295913 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:15.302631 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:15.302606 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:15.313416 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:15.313393 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:15.330095 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:15.330070 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:15.338039 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:15.338017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:16.309913 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:16.309882 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:16.331947 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:16.331914 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:16.340197 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:16.340165 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:16.352192 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:16.352167 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:16.371014 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:16.370987 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:16.379971 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:16.379949 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:17.338371 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:17.338340 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:17.362907 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:17.362874 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:17.371842 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:17.371813 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:17.382022 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:17.381997 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:17.400188 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:17.400164 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:17.408730 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:17.408709 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:18.383924 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:18.383885 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:18.406759 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:18.406718 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:18.413720 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:18.413689 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:18.425447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:18.425424 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:18.442997 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:18.442971 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:18.451989 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:18.451969 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:19.415748 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:19.415720 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:19.438444 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:19.438417 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:19.449867 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:19.449843 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:19.462535 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:19.462505 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:19.483517 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:19.483489 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:19.492499 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:19.492473 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:20.653660 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:20.653634 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:20.675619 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:20.675592 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:20.683567 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:20.683544 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:20.694234 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:20.694209 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:20.712129 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:20.712102 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:20.722419 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:20.722394 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:21.692082 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:21.692051 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:21.713096 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:21.713063 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:21.724564 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:21.724540 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:21.735815 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:21.735787 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:21.755360 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:21.755337 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:21.764696 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:21.764672 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:22.722555 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:22.722528 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:22.745601 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:22.745569 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:22.753134 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:22.753096 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:22.764508 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:22.764484 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:22.782475 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:22.782445 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:22.792423 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:22.792387 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:23.756896 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:23.756870 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:23.779230 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:23.779203 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:23.785864 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:23.785837 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:23.797190 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:23.797170 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:23.816692 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:23.816665 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:23.825672 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:23.825647 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:24.781102 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:24.781077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:24.802827 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:24.802802 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:24.811084 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:24.810986 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:24.821445 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:24.821423 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:24.839917 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:24.839896 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:24.851313 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:24.851290 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:25.791857 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:25.791831 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-4p7m7_71934c38-1068-4b27-8623-52057ef5f6b8/istio-proxy/0.log" Apr 16 16:55:25.812335 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:25.812303 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:25.818552 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:25.818527 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/llm-d-routing-sidecar/0.log" Apr 16 16:55:25.828362 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:25.828335 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/storage-initializer/0.log" Apr 16 16:55:25.845685 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:25.845657 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/main/0.log" Apr 16 16:55:25.854320 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:25.854288 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5_1b30d8ef-2b9e-4771-8954-1a186e65e310/storage-initializer/0.log" Apr 16 16:55:26.677573 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:26.677522 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="llm-d-routing-sidecar" containerID="cri-o://3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075" gracePeriod=2 Apr 16 16:55:26.914913 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:26.914886 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6db657d5cd-98k8d_23b07769-d51e-41eb-bd71-9c987916dbd8/router/0.log" Apr 16 16:55:26.934414 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:26.934352 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" Apr 16 16:55:26.952607 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:26.952585 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:26.953296 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:26.953278 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" Apr 16 16:55:27.001404 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001376 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7m27\" (UniqueName: \"kubernetes.io/projected/d95ee401-7225-4eeb-99c6-b6879e724493-kube-api-access-b7m27\") pod \"d95ee401-7225-4eeb-99c6-b6879e724493\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " Apr 16 16:55:27.001404 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001412 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b30d8ef-2b9e-4771-8954-1a186e65e310-tls-certs\") pod \"1b30d8ef-2b9e-4771-8954-1a186e65e310\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " Apr 16 16:55:27.001659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001430 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d95ee401-7225-4eeb-99c6-b6879e724493-tls-certs\") pod \"d95ee401-7225-4eeb-99c6-b6879e724493\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " Apr 16 16:55:27.001659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001463 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-model-cache\") pod 
\"d95ee401-7225-4eeb-99c6-b6879e724493\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " Apr 16 16:55:27.001659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001488 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-home\") pod \"1b30d8ef-2b9e-4771-8954-1a186e65e310\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " Apr 16 16:55:27.001659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001509 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-dshm\") pod \"1b30d8ef-2b9e-4771-8954-1a186e65e310\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " Apr 16 16:55:27.001659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001540 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-model-cache\") pod \"1b30d8ef-2b9e-4771-8954-1a186e65e310\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " Apr 16 16:55:27.001659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001607 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-dshm\") pod \"d95ee401-7225-4eeb-99c6-b6879e724493\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " Apr 16 16:55:27.001659 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001636 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-kserve-provision-location\") pod \"1b30d8ef-2b9e-4771-8954-1a186e65e310\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " Apr 16 16:55:27.002021 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001667 2571 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-home\") pod \"d95ee401-7225-4eeb-99c6-b6879e724493\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " Apr 16 16:55:27.002021 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001698 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hprqs\" (UniqueName: \"kubernetes.io/projected/1b30d8ef-2b9e-4771-8954-1a186e65e310-kube-api-access-hprqs\") pod \"1b30d8ef-2b9e-4771-8954-1a186e65e310\" (UID: \"1b30d8ef-2b9e-4771-8954-1a186e65e310\") " Apr 16 16:55:27.002021 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001738 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-kserve-provision-location\") pod \"d95ee401-7225-4eeb-99c6-b6879e724493\" (UID: \"d95ee401-7225-4eeb-99c6-b6879e724493\") " Apr 16 16:55:27.002021 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001731 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-model-cache" (OuterVolumeSpecName: "model-cache") pod "d95ee401-7225-4eeb-99c6-b6879e724493" (UID: "d95ee401-7225-4eeb-99c6-b6879e724493"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.002021 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.001965 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-home" (OuterVolumeSpecName: "home") pod "1b30d8ef-2b9e-4771-8954-1a186e65e310" (UID: "1b30d8ef-2b9e-4771-8954-1a186e65e310"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.002306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.002198 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-model-cache\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.002306 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.002220 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-home\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.002419 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.002339 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-home" (OuterVolumeSpecName: "home") pod "d95ee401-7225-4eeb-99c6-b6879e724493" (UID: "d95ee401-7225-4eeb-99c6-b6879e724493"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.002476 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.002448 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-model-cache" (OuterVolumeSpecName: "model-cache") pod "1b30d8ef-2b9e-4771-8954-1a186e65e310" (UID: "1b30d8ef-2b9e-4771-8954-1a186e65e310"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.004164 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.004109 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b30d8ef-2b9e-4771-8954-1a186e65e310-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1b30d8ef-2b9e-4771-8954-1a186e65e310" (UID: "1b30d8ef-2b9e-4771-8954-1a186e65e310"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:55:27.004304 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.004213 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d95ee401-7225-4eeb-99c6-b6879e724493-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d95ee401-7225-4eeb-99c6-b6879e724493" (UID: "d95ee401-7225-4eeb-99c6-b6879e724493"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:55:27.004372 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.004355 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-dshm" (OuterVolumeSpecName: "dshm") pod "d95ee401-7225-4eeb-99c6-b6879e724493" (UID: "d95ee401-7225-4eeb-99c6-b6879e724493"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.004447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.004422 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-dshm" (OuterVolumeSpecName: "dshm") pod "1b30d8ef-2b9e-4771-8954-1a186e65e310" (UID: "1b30d8ef-2b9e-4771-8954-1a186e65e310"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.004447 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.004426 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95ee401-7225-4eeb-99c6-b6879e724493-kube-api-access-b7m27" (OuterVolumeSpecName: "kube-api-access-b7m27") pod "d95ee401-7225-4eeb-99c6-b6879e724493" (UID: "d95ee401-7225-4eeb-99c6-b6879e724493"). InnerVolumeSpecName "kube-api-access-b7m27". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:55:27.005051 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.005025 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b30d8ef-2b9e-4771-8954-1a186e65e310-kube-api-access-hprqs" (OuterVolumeSpecName: "kube-api-access-hprqs") pod "1b30d8ef-2b9e-4771-8954-1a186e65e310" (UID: "1b30d8ef-2b9e-4771-8954-1a186e65e310"). InnerVolumeSpecName "kube-api-access-hprqs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:55:27.052168 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.052100 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d95ee401-7225-4eeb-99c6-b6879e724493" (UID: "d95ee401-7225-4eeb-99c6-b6879e724493"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.059064 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.059035 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1b30d8ef-2b9e-4771-8954-1a186e65e310" (UID: "1b30d8ef-2b9e-4771-8954-1a186e65e310"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:55:27.103630 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103590 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hprqs\" (UniqueName: \"kubernetes.io/projected/1b30d8ef-2b9e-4771-8954-1a186e65e310-kube-api-access-hprqs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103630 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103622 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103630 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103631 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b7m27\" (UniqueName: \"kubernetes.io/projected/d95ee401-7225-4eeb-99c6-b6879e724493-kube-api-access-b7m27\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103642 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1b30d8ef-2b9e-4771-8954-1a186e65e310-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103653 2571 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d95ee401-7225-4eeb-99c6-b6879e724493-tls-certs\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103661 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-dshm\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103887 ip-10-0-138-125 kubenswrapper[2571]: I0416 
16:55:27.103669 2571 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-model-cache\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103677 2571 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-dshm\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103686 2571 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1b30d8ef-2b9e-4771-8954-1a186e65e310-kserve-provision-location\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.103887 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.103695 2571 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d95ee401-7225-4eeb-99c6-b6879e724493-home\") on node \"ip-10-0-138-125.ec2.internal\" DevicePath \"\"" Apr 16 16:55:27.537246 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537218 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-5557b45666-skg9c_d95ee401-7225-4eeb-99c6-b6879e724493/main/0.log" Apr 16 16:55:27.537816 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537786 2571 generic.go:358] "Generic (PLEG): container finished" podID="d95ee401-7225-4eeb-99c6-b6879e724493" containerID="62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c" exitCode=137 Apr 16 16:55:27.537816 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537815 2571 generic.go:358] "Generic (PLEG): container finished" podID="d95ee401-7225-4eeb-99c6-b6879e724493" containerID="3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075" exitCode=0 Apr 16 16:55:27.537988 
ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537865 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerDied","Data":"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c"} Apr 16 16:55:27.537988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537880 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" Apr 16 16:55:27.537988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537902 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerDied","Data":"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075"} Apr 16 16:55:27.537988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c" event={"ID":"d95ee401-7225-4eeb-99c6-b6879e724493","Type":"ContainerDied","Data":"d4879f93cc09401336985b58e89b206a8d938bdd58b30a14a3467a9e5ee21295"} Apr 16 16:55:27.537988 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.537927 2571 scope.go:117] "RemoveContainer" containerID="62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c" Apr 16 16:55:27.539619 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.539587 2571 generic.go:358] "Generic (PLEG): container finished" podID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerID="d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6" exitCode=137 Apr 16 16:55:27.539728 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.539632 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" 
event={"ID":"1b30d8ef-2b9e-4771-8954-1a186e65e310","Type":"ContainerDied","Data":"d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6"} Apr 16 16:55:27.539728 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.539669 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" event={"ID":"1b30d8ef-2b9e-4771-8954-1a186e65e310","Type":"ContainerDied","Data":"223b749362bc934545fee48090d24110c41931a7cbb6d89b0b9fabda56739c54"} Apr 16 16:55:27.539728 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.539674 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5" Apr 16 16:55:27.564665 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.564051 2571 scope.go:117] "RemoveContainer" containerID="5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837" Apr 16 16:55:27.564665 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.564616 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"] Apr 16 16:55:27.567569 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.567544 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-5557b45666-skg9c"] Apr 16 16:55:27.580737 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.580707 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"] Apr 16 16:55:27.584073 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.584048 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-588f9994fc-pmxz5"] Apr 16 16:55:27.615450 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.615427 2571 scope.go:117] "RemoveContainer" 
containerID="3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075" Apr 16 16:55:27.623900 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.623878 2571 scope.go:117] "RemoveContainer" containerID="62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c" Apr 16 16:55:27.624208 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:55:27.624180 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c\": container with ID starting with 62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c not found: ID does not exist" containerID="62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c" Apr 16 16:55:27.624305 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.624213 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c"} err="failed to get container status \"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c\": rpc error: code = NotFound desc = could not find container \"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c\": container with ID starting with 62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c not found: ID does not exist" Apr 16 16:55:27.624305 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.624234 2571 scope.go:117] "RemoveContainer" containerID="5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837" Apr 16 16:55:27.624500 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:55:27.624481 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837\": container with ID starting with 5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837 not found: ID does not exist" 
containerID="5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837" Apr 16 16:55:27.624542 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.624511 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837"} err="failed to get container status \"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837\": rpc error: code = NotFound desc = could not find container \"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837\": container with ID starting with 5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837 not found: ID does not exist" Apr 16 16:55:27.624542 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.624532 2571 scope.go:117] "RemoveContainer" containerID="3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075" Apr 16 16:55:27.624768 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:55:27.624752 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075\": container with ID starting with 3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075 not found: ID does not exist" containerID="3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075" Apr 16 16:55:27.624812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.624772 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075"} err="failed to get container status \"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075\": rpc error: code = NotFound desc = could not find container \"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075\": container with ID starting with 3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075 not found: ID does not exist" Apr 16 
16:55:27.624812 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.624791 2571 scope.go:117] "RemoveContainer" containerID="62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c" Apr 16 16:55:27.625032 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.625013 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c"} err="failed to get container status \"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c\": rpc error: code = NotFound desc = could not find container \"62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c\": container with ID starting with 62334126169c1b6bc438076d48151b9bf39036ebd1caa556efdd405f8b2fb55c not found: ID does not exist" Apr 16 16:55:27.625098 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.625034 2571 scope.go:117] "RemoveContainer" containerID="5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837" Apr 16 16:55:27.625276 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.625255 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837"} err="failed to get container status \"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837\": rpc error: code = NotFound desc = could not find container \"5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837\": container with ID starting with 5f61e9d4b8f0d7e87bc7bcf09c5e9d22b07b2f1c6b34e3d3cc1aab7b3a5d6837 not found: ID does not exist" Apr 16 16:55:27.625347 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.625277 2571 scope.go:117] "RemoveContainer" containerID="3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075" Apr 16 16:55:27.625512 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.625495 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075"} err="failed to get container status \"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075\": rpc error: code = NotFound desc = could not find container \"3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075\": container with ID starting with 3024400afd4c3da5ce365c683d615d160945fed79bfc3f913df6fcd68e34e075 not found: ID does not exist" Apr 16 16:55:27.625552 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.625514 2571 scope.go:117] "RemoveContainer" containerID="d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6" Apr 16 16:55:27.645407 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.645382 2571 scope.go:117] "RemoveContainer" containerID="6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704" Apr 16 16:55:27.704909 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.704741 2571 scope.go:117] "RemoveContainer" containerID="d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6" Apr 16 16:55:27.705089 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:55:27.705067 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6\": container with ID starting with d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6 not found: ID does not exist" containerID="d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6" Apr 16 16:55:27.705173 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.705102 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6"} err="failed to get container status \"d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6\": rpc error: code = NotFound desc = could not find container 
\"d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6\": container with ID starting with d951ebc7af704f53174bf84410caba84ebb95142df8856aec7c2198d24945fd6 not found: ID does not exist" Apr 16 16:55:27.705173 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.705143 2571 scope.go:117] "RemoveContainer" containerID="6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704" Apr 16 16:55:27.705416 ip-10-0-138-125 kubenswrapper[2571]: E0416 16:55:27.705399 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704\": container with ID starting with 6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704 not found: ID does not exist" containerID="6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704" Apr 16 16:55:27.705467 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.705423 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704"} err="failed to get container status \"6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704\": rpc error: code = NotFound desc = could not find container \"6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704\": container with ID starting with 6b1a1d5588f59c1a698217de744f245e9187f279ae8e1e4b51cbb51c5a39d704 not found: ID does not exist" Apr 16 16:55:27.777734 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:27.777690 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6db657d5cd-98k8d_23b07769-d51e-41eb-bd71-9c987916dbd8/router/0.log" Apr 16 16:55:28.621430 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:28.621404 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ncp8v_ca68a685-0c86-4975-91b3-eb93ee8b65b6/kuadrant-console-plugin/0.log" Apr 16 16:55:28.755975 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:28.755945 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" path="/var/lib/kubelet/pods/1b30d8ef-2b9e-4771-8954-1a186e65e310/volumes" Apr 16 16:55:28.756409 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:28.756396 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" path="/var/lib/kubelet/pods/d95ee401-7225-4eeb-99c6-b6879e724493/volumes" Apr 16 16:55:31.049381 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049341 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pfzkj/must-gather-p5vw4"] Apr 16 16:55:31.049826 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049764 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="storage-initializer" Apr 16 16:55:31.049826 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049777 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="storage-initializer" Apr 16 16:55:31.049826 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049794 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="llm-d-routing-sidecar" Apr 16 16:55:31.049826 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049799 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="llm-d-routing-sidecar" Apr 16 16:55:31.049826 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049811 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" 
containerName="main"
Apr 16 16:55:31.049826 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049817 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main"
Apr 16 16:55:31.049826 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049828 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="storage-initializer"
Apr 16 16:55:31.050049 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049834 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="storage-initializer"
Apr 16 16:55:31.050049 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049840 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main"
Apr 16 16:55:31.050049 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049844 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main"
Apr 16 16:55:31.050049 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049896 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="llm-d-routing-sidecar"
Apr 16 16:55:31.050049 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049907 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b30d8ef-2b9e-4771-8954-1a186e65e310" containerName="main"
Apr 16 16:55:31.050049 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.049914 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="d95ee401-7225-4eeb-99c6-b6879e724493" containerName="main"
Apr 16 16:55:31.054661 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.054642 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.057074 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.057051 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfzkj\"/\"kube-root-ca.crt\""
Apr 16 16:55:31.058100 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.058075 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pfzkj\"/\"openshift-service-ca.crt\""
Apr 16 16:55:31.058365 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.058344 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pfzkj\"/\"default-dockercfg-q75nm\""
Apr 16 16:55:31.061087 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.061060 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/must-gather-p5vw4"]
Apr 16 16:55:31.140824 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.140784 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b7500dd-28ce-4080-b3e4-cf127afcabff-must-gather-output\") pod \"must-gather-p5vw4\" (UID: \"2b7500dd-28ce-4080-b3e4-cf127afcabff\") " pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.141001 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.140836 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vsgt\" (UniqueName: \"kubernetes.io/projected/2b7500dd-28ce-4080-b3e4-cf127afcabff-kube-api-access-9vsgt\") pod \"must-gather-p5vw4\" (UID: \"2b7500dd-28ce-4080-b3e4-cf127afcabff\") " pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.241643 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.241606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b7500dd-28ce-4080-b3e4-cf127afcabff-must-gather-output\") pod \"must-gather-p5vw4\" (UID: \"2b7500dd-28ce-4080-b3e4-cf127afcabff\") " pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.241869 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.241659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vsgt\" (UniqueName: \"kubernetes.io/projected/2b7500dd-28ce-4080-b3e4-cf127afcabff-kube-api-access-9vsgt\") pod \"must-gather-p5vw4\" (UID: \"2b7500dd-28ce-4080-b3e4-cf127afcabff\") " pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.242028 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.242005 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2b7500dd-28ce-4080-b3e4-cf127afcabff-must-gather-output\") pod \"must-gather-p5vw4\" (UID: \"2b7500dd-28ce-4080-b3e4-cf127afcabff\") " pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.250033 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.249999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vsgt\" (UniqueName: \"kubernetes.io/projected/2b7500dd-28ce-4080-b3e4-cf127afcabff-kube-api-access-9vsgt\") pod \"must-gather-p5vw4\" (UID: \"2b7500dd-28ce-4080-b3e4-cf127afcabff\") " pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.364816 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.364732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfzkj/must-gather-p5vw4"
Apr 16 16:55:31.500401 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.500372 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/must-gather-p5vw4"]
Apr 16 16:55:31.501864 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:55:31.501835 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b7500dd_28ce_4080_b3e4_cf127afcabff.slice/crio-0331d6df048d894369cbad957b6a593563f49dbc94e1002241029b904009ee56 WatchSource:0}: Error finding container 0331d6df048d894369cbad957b6a593563f49dbc94e1002241029b904009ee56: Status 404 returned error can't find the container with id 0331d6df048d894369cbad957b6a593563f49dbc94e1002241029b904009ee56
Apr 16 16:55:31.503581 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.503559 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:55:31.557464 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:31.557429 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/must-gather-p5vw4" event={"ID":"2b7500dd-28ce-4080-b3e4-cf127afcabff","Type":"ContainerStarted","Data":"0331d6df048d894369cbad957b6a593563f49dbc94e1002241029b904009ee56"}
Apr 16 16:55:32.563667 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:32.563627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/must-gather-p5vw4" event={"ID":"2b7500dd-28ce-4080-b3e4-cf127afcabff","Type":"ContainerStarted","Data":"92d5fda5315be2ee66beeefa39209a14dadc6c3681144e2292612133f0df213e"}
Apr 16 16:55:32.564026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:32.563675 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/must-gather-p5vw4" event={"ID":"2b7500dd-28ce-4080-b3e4-cf127afcabff","Type":"ContainerStarted","Data":"4f71ce04cc37bd2a9bc72d2b64c10d6462b686191a80c7f4692285ca4a2fd90c"}
Apr 16 16:55:33.995868 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:33.995838 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-f7w8r_c466876b-30ba-4711-affc-092c2f5418b3/global-pull-secret-syncer/0.log"
Apr 16 16:55:34.152593 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:34.152563 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p2krm_6fa936c5-e151-4f22-8ab2-c2bc28919d4b/konnectivity-agent/0.log"
Apr 16 16:55:34.260311 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:34.260220 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-125.ec2.internal_3daeaf9dae8edeea4bbaed1ffe567636/haproxy/0.log"
Apr 16 16:55:38.018633 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:38.018596 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-ncp8v_ca68a685-0c86-4975-91b3-eb93ee8b65b6/kuadrant-console-plugin/0.log"
Apr 16 16:55:39.282610 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.282572 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ea169814-38bf-4113-95a2-44b8fef7b9b0/alertmanager/0.log"
Apr 16 16:55:39.309882 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.309850 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ea169814-38bf-4113-95a2-44b8fef7b9b0/config-reloader/0.log"
Apr 16 16:55:39.335769 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.335739 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ea169814-38bf-4113-95a2-44b8fef7b9b0/kube-rbac-proxy-web/0.log"
Apr 16 16:55:39.359709 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.359684 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ea169814-38bf-4113-95a2-44b8fef7b9b0/kube-rbac-proxy/0.log"
Apr 16 16:55:39.384975 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.384946 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ea169814-38bf-4113-95a2-44b8fef7b9b0/kube-rbac-proxy-metric/0.log"
Apr 16 16:55:39.411375 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.411338 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ea169814-38bf-4113-95a2-44b8fef7b9b0/prom-label-proxy/0.log"
Apr 16 16:55:39.441146 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.441101 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ea169814-38bf-4113-95a2-44b8fef7b9b0/init-config-reloader/0.log"
Apr 16 16:55:39.489908 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.489869 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-jl4xw_43a0dbe4-b395-4cbb-93b0-0918a13c59ec/cluster-monitoring-operator/0.log"
Apr 16 16:55:39.606572 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.606484 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-575cb7dcdc-lzr8k_c972e32f-ab16-4627-aca1-2b89acfd43f7/metrics-server/0.log"
Apr 16 16:55:39.634432 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.634399 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-xt52t_615e1cf5-bc9c-4926-ab18-adea83d0889c/monitoring-plugin/0.log"
Apr 16 16:55:39.744254 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.744225 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7kmlg_5177c104-80c5-4f14-b607-d2d272ec4b4a/node-exporter/0.log"
Apr 16 16:55:39.768751 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.768722 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7kmlg_5177c104-80c5-4f14-b607-d2d272ec4b4a/kube-rbac-proxy/0.log"
Apr 16 16:55:39.791054 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:39.791022 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7kmlg_5177c104-80c5-4f14-b607-d2d272ec4b4a/init-textfile/0.log"
Apr 16 16:55:40.323008 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.322983 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56658d986b-vlstz_d6bc6542-5e72-45d7-90bd-5b1414a1404d/telemeter-client/0.log"
Apr 16 16:55:40.354922 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.354886 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56658d986b-vlstz_d6bc6542-5e72-45d7-90bd-5b1414a1404d/reload/0.log"
Apr 16 16:55:40.398627 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.398403 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56658d986b-vlstz_d6bc6542-5e72-45d7-90bd-5b1414a1404d/kube-rbac-proxy/0.log"
Apr 16 16:55:40.446322 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.446285 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67b7f774bc-wlc24_d9b9c6d2-d1b2-4297-970e-6174d592cccd/thanos-query/0.log"
Apr 16 16:55:40.503459 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.503424 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67b7f774bc-wlc24_d9b9c6d2-d1b2-4297-970e-6174d592cccd/kube-rbac-proxy-web/0.log"
Apr 16 16:55:40.544213 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.544110 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67b7f774bc-wlc24_d9b9c6d2-d1b2-4297-970e-6174d592cccd/kube-rbac-proxy/0.log"
Apr 16 16:55:40.617279 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.617243 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67b7f774bc-wlc24_d9b9c6d2-d1b2-4297-970e-6174d592cccd/prom-label-proxy/0.log"
Apr 16 16:55:40.679808 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.679771 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67b7f774bc-wlc24_d9b9c6d2-d1b2-4297-970e-6174d592cccd/kube-rbac-proxy-rules/0.log"
Apr 16 16:55:40.708705 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:40.708586 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67b7f774bc-wlc24_d9b9c6d2-d1b2-4297-970e-6174d592cccd/kube-rbac-proxy-metrics/0.log"
Apr 16 16:55:41.910651 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:41.910617 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-jzjhf_b5859820-c3a7-4853-87f4-1a9946dbeaa1/networking-console-plugin/0.log"
Apr 16 16:55:42.445678 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.445649 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/2.log"
Apr 16 16:55:42.448498 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.448479 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-n874j_2e38bc16-772b-49da-b705-b184cb60f9bd/console-operator/1.log"
Apr 16 16:55:42.659033 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.658969 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pfzkj/must-gather-p5vw4" podStartSLOduration=10.964976956 podStartE2EDuration="11.658946744s" podCreationTimestamp="2026-04-16 16:55:31 +0000 UTC" firstStartedPulling="2026-04-16 16:55:31.503762975 +0000 UTC m=+1935.338615874" lastFinishedPulling="2026-04-16 16:55:32.197732776 +0000 UTC m=+1936.032585662" observedRunningTime="2026-04-16 16:55:32.580846406 +0000 UTC m=+1936.415699321" watchObservedRunningTime="2026-04-16 16:55:42.658946744 +0000 UTC m=+1946.493799651"
Apr 16 16:55:42.662020 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.661986 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"]
Apr 16 16:55:42.667265 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.667241 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.677999 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.677966 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"]
Apr 16 16:55:42.766026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.765550 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz2rt\" (UniqueName: \"kubernetes.io/projected/cea63149-5df0-4af8-b853-c06487078f98-kube-api-access-nz2rt\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.766026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.765621 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-lib-modules\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.766026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.765689 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-proc\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.766026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.765765 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-sys\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.766026 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.765858 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-podres\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.866616 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.866576 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz2rt\" (UniqueName: \"kubernetes.io/projected/cea63149-5df0-4af8-b853-c06487078f98-kube-api-access-nz2rt\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.867222 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.867195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-lib-modules\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.867870 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.867853 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-proc\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.868067 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.868043 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-proc\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.868402 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.867715 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-lib-modules\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.868662 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.868634 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-sys\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.868821 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.868445 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-sys\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.868915 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.868844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-podres\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.869007 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.868991 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cea63149-5df0-4af8-b853-c06487078f98-podres\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.882003 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.876320 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz2rt\" (UniqueName: \"kubernetes.io/projected/cea63149-5df0-4af8-b853-c06487078f98-kube-api-access-nz2rt\") pod \"perf-node-gather-daemonset-xh5tg\" (UID: \"cea63149-5df0-4af8-b853-c06487078f98\") " pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:42.967195 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.967161 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6d9946d7-vfltv_df7489c2-d9fe-42c5-b256-842a98898a25/console/0.log"
Apr 16 16:55:42.979673 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:42.979641 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:43.019809 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:43.019686 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-q95n6_3823e5d2-5cbd-4e2a-bf95-dacceea78679/download-server/0.log"
Apr 16 16:55:43.157755 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:43.157726 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"]
Apr 16 16:55:43.159186 ip-10-0-138-125 kubenswrapper[2571]: W0416 16:55:43.159157 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcea63149_5df0_4af8_b853_c06487078f98.slice/crio-977314ab2f73247bab54ab418621c5ed7b15147fc9f7b396c9b77a998fc58665 WatchSource:0}: Error finding container 977314ab2f73247bab54ab418621c5ed7b15147fc9f7b396c9b77a998fc58665: Status 404 returned error can't find the container with id 977314ab2f73247bab54ab418621c5ed7b15147fc9f7b396c9b77a998fc58665
Apr 16 16:55:43.506042 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:43.506017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-sqx2s_03a575af-ab57-48c7-9610-4e4bca8f14d2/volume-data-source-validator/0.log"
Apr 16 16:55:43.622355 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:43.622251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg" event={"ID":"cea63149-5df0-4af8-b853-c06487078f98","Type":"ContainerStarted","Data":"9210b8906956d604599445ae2ccbb1f10243b06451ca4ea866371d226849ee92"}
Apr 16 16:55:43.622355 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:43.622288 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg" event={"ID":"cea63149-5df0-4af8-b853-c06487078f98","Type":"ContainerStarted","Data":"977314ab2f73247bab54ab418621c5ed7b15147fc9f7b396c9b77a998fc58665"}
Apr 16 16:55:43.622571 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:43.622381 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:43.653095 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:43.653031 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg" podStartSLOduration=1.653013426 podStartE2EDuration="1.653013426s" podCreationTimestamp="2026-04-16 16:55:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:55:43.652129853 +0000 UTC m=+1947.486982758" watchObservedRunningTime="2026-04-16 16:55:43.653013426 +0000 UTC m=+1947.487866408"
Apr 16 16:55:44.345081 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:44.345054 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-825fq_a8d51d9d-8870-4740-83f4-ac61a0da4fee/dns/0.log"
Apr 16 16:55:44.367972 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:44.367944 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-825fq_a8d51d9d-8870-4740-83f4-ac61a0da4fee/kube-rbac-proxy/0.log"
Apr 16 16:55:44.583308 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:44.583276 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9q4lm_037863f7-80dd-4ea9-9735-c27f4d903d1d/dns-node-resolver/0.log"
Apr 16 16:55:45.140182 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:45.140153 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7th94_93ea3158-fb50-45bd-a3ff-a9af8b130de9/node-ca/0.log"
Apr 16 16:55:46.139905 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:46.139872 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6db657d5cd-98k8d_23b07769-d51e-41eb-bd71-9c987916dbd8/router/0.log"
Apr 16 16:55:46.703792 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:46.703763 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-497mc_6fe47973-f1d6-4d8b-bcfc-e729c9709d5f/serve-healthcheck-canary/0.log"
Apr 16 16:55:47.252246 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:47.252210 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-hxp97_aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a/insights-operator/1.log"
Apr 16 16:55:47.252888 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:47.252868 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-hxp97_aa6d7ce4-f88b-42f5-b5ec-36b1c9e90e9a/insights-operator/0.log"
Apr 16 16:55:47.466399 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:47.466360 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vcnhj_2b669da5-f0be-47de-9e64-383b411f4607/kube-rbac-proxy/0.log"
Apr 16 16:55:47.539290 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:47.539213 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vcnhj_2b669da5-f0be-47de-9e64-383b411f4607/exporter/0.log"
Apr 16 16:55:47.595833 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:47.595807 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vcnhj_2b669da5-f0be-47de-9e64-383b411f4607/extractor/0.log"
Apr 16 16:55:49.637193 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:49.637161 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-pfzkj/perf-node-gather-daemonset-xh5tg"
Apr 16 16:55:50.438874 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:50.438837 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-846585b969-dp7jj_fdb886ae-75bd-4c87-9ef8-29bcf06d0306/manager/0.log"
Apr 16 16:55:50.500083 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:50.500041 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-k5pnt_d325c16d-746b-4cdf-94e2-979b7831c5d3/openshift-lws-operator/0.log"
Apr 16 16:55:51.189439 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:51.189409 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-966449c67-9z4cp_ee09a009-c8f0-4fba-9206-58596c2a7b93/manager/0.log"
Apr 16 16:55:51.460838 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:51.460745 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-q5xtr_073d2f21-2a54-4b10-b12b-4a5daaa15777/manager/0.log"
Apr 16 16:55:51.492747 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:51.492717 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-84kpx_e9cf7092-889c-45ab-8613-967b93b85c04/s3-init/0.log"
Apr 16 16:55:56.784279 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:56.784247 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-98gh8_36c0655a-39bf-41ac-a093-9d9b0567949f/migrator/0.log"
Apr 16 16:55:56.810529 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:56.810502 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-98gh8_36c0655a-39bf-41ac-a093-9d9b0567949f/graceful-termination/0.log"
Apr 16 16:55:58.388987 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:58.388960 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7gghf_6840e957-0163-4053-b7e6-599a98718065/kube-multus/0.log"
Apr 16 16:55:58.906819 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:58.906792 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9km4_6e96b3ac-b7d4-44c4-92c5-7706938e5538/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:55:58.933481 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:58.933450 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9km4_6e96b3ac-b7d4-44c4-92c5-7706938e5538/egress-router-binary-copy/0.log"
Apr 16 16:55:58.982597 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:58.982559 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9km4_6e96b3ac-b7d4-44c4-92c5-7706938e5538/cni-plugins/0.log"
Apr 16 16:55:59.002043 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:59.002017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9km4_6e96b3ac-b7d4-44c4-92c5-7706938e5538/bond-cni-plugin/0.log"
Apr 16 16:55:59.025357 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:59.025323 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9km4_6e96b3ac-b7d4-44c4-92c5-7706938e5538/routeoverride-cni/0.log"
Apr 16 16:55:59.049588 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:59.049554 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9km4_6e96b3ac-b7d4-44c4-92c5-7706938e5538/whereabouts-cni-bincopy/0.log"
Apr 16 16:55:59.072788 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:59.072762 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-f9km4_6e96b3ac-b7d4-44c4-92c5-7706938e5538/whereabouts-cni/0.log"
Apr 16 16:55:59.240187 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:59.240068 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mtw25_073e645f-92a9-4855-9057-6a125ec9ebda/network-metrics-daemon/0.log"
Apr 16 16:55:59.273734 ip-10-0-138-125 kubenswrapper[2571]: I0416 16:55:59.273702 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mtw25_073e645f-92a9-4855-9057-6a125ec9ebda/kube-rbac-proxy/0.log"