Apr 16 14:52:52.228107 ip-10-0-137-160 systemd[1]: Starting Kubernetes Kubelet... Apr 16 14:52:52.668902 ip-10-0-137-160 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 14:52:52.668902 ip-10-0-137-160 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 16 14:52:52.668902 ip-10-0-137-160 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 14:52:52.668902 ip-10-0-137-160 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 16 14:52:52.668902 ip-10-0-137-160 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 16 14:52:52.669670 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.669592 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 16 14:52:52.674426 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674406 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:52.674426 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674423 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:52.674426 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674427 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:52.674426 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674431 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674435 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674437 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674440 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674443 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674446 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674449 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674452 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674454 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 
14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674457 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674459 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674462 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674465 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674469 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674473 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674476 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674479 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674482 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674484 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674487 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:52.674601 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674490 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674493 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674496 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674503 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674506 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674509 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674511 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674514 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674517 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674520 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674522 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:52.675099 ip-10-0-137-160 
kubenswrapper[2580]: W0416 14:52:52.674525 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674528 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674530 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674533 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674536 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674539 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674541 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674544 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674547 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:52.675099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674549 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674552 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674554 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674557 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674559 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674562 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674564 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674567 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674570 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674572 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674576 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674579 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674581 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674584 
2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674587 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674589 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674593 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674596 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674599 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:52.675600 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674601 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674604 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674607 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674609 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674612 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674614 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674617 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674621 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674624 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674627 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674629 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674632 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674634 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674637 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674640 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674642 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674646 2580 feature_gate.go:328] unrecognized feature gate: 
ImageModeStatusReporting Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674649 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674652 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674655 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:52.676100 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674658 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674660 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674663 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.674665 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675068 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675073 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675076 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675078 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675081 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675084 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675087 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675090 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675092 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675095 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675098 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675100 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675103 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675106 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675109 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:52.676571 ip-10-0-137-160 
kubenswrapper[2580]: W0416 14:52:52.675113 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:52.676571 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675116 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675119 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675121 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675124 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675126 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675129 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675132 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675134 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675137 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675140 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675142 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675145 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675147 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675150 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675152 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675155 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675157 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675160 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675162 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675165 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:52.677103 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675167 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675170 2580 feature_gate.go:328] unrecognized feature gate: 
NewOLM Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675175 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675178 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675181 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675184 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675187 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675189 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675192 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675195 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675198 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675200 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675203 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675206 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675208 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675210 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675213 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675215 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675218 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:52.677622 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675221 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675223 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675226 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675229 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675231 2580 feature_gate.go:328] unrecognized feature 
gate: Example2 Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675234 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675236 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675239 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675241 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675249 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675252 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675255 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675257 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675261 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675263 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675266 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675268 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675271 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675273 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675276 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:52.678102 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675278 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675281 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675284 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675286 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675289 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675292 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675294 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 
14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675297 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675299 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675302 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.675304 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676391 2580 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676401 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676407 2580 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676411 2580 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676416 2580 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676420 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676424 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676428 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676432 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676435 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 14:52:52.678591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676438 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676441 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676444 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676447 2580 flags.go:64] FLAG: --cgroup-root="" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676450 2580 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676453 2580 flags.go:64] FLAG: --client-ca-file="" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676456 2580 flags.go:64] FLAG: --cloud-config="" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676458 2580 flags.go:64] FLAG: --cloud-provider="external" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676461 2580 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676466 2580 flags.go:64] FLAG: --cluster-domain="" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676469 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 
14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676472 2580 flags.go:64] FLAG: --config-dir="" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676474 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676478 2580 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676482 2580 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676485 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676489 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676492 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676495 2580 flags.go:64] FLAG: --contention-profiling="false" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676497 2580 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676500 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676504 2580 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676506 2580 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676510 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676514 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 14:52:52.679117 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676516 2580 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676519 2580 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676522 2580 flags.go:64] FLAG: --enable-server="true" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676525 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676530 2580 flags.go:64] FLAG: --event-burst="100" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676533 2580 flags.go:64] FLAG: --event-qps="50" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676536 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676539 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676542 2580 flags.go:64] FLAG: --eviction-hard="" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676546 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676548 2580 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676551 2580 
flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676554 2580 flags.go:64] FLAG: --eviction-soft="" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676557 2580 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676560 2580 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676562 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676565 2580 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676568 2580 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676571 2580 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676574 2580 flags.go:64] FLAG: --feature-gates="" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676578 2580 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676581 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676585 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676589 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676592 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:52:52.679723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676595 2580 flags.go:64] FLAG: --help="false" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676598 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-137-160.ec2.internal" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676601 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676604 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676607 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676610 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676613 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676616 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676619 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676622 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676625 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676628 
2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676631 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676634 2580 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676637 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676639 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676642 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676645 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676648 2580 flags.go:64] FLAG: --lock-file="" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676650 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676653 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676656 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676666 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:52:52.680328 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676669 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676672 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676674 2580 flags.go:64] FLAG: --logging-format="text" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676677 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676681 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676684 2580 flags.go:64] FLAG: --manifest-url="" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676687 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676691 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676694 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676699 2580 flags.go:64] FLAG: --max-pods="110" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676702 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676705 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676708 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676711 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: 
I0416 14:52:52.676714 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676716 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676719 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676726 2580 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676729 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676732 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676736 2580 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676739 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676744 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676747 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:52:52.680909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676750 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676752 2580 flags.go:64] FLAG: --port="10250" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676755 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676758 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0eee25f26ebc9ccf2" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676761 2580 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676764 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676767 2580 flags.go:64] FLAG: --register-node="true" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676770 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676773 2580 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676776 2580 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676779 2580 flags.go:64] FLAG: --registry-qps="5" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676782 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676785 2580 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676789 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676793 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676796 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 
14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676798 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676801 2580 flags.go:64] FLAG: --runonce="false" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676804 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676807 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676810 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676812 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676824 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676827 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676831 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676834 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:52:52.681481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676852 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676856 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676858 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676862 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676865 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676868 2580 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676871 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676876 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676879 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676882 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676886 2580 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676889 2580 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676892 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676894 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676897 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676900 2580 flags.go:64] FLAG: 
--v="2" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676904 2580 flags.go:64] FLAG: --version="false" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676908 2580 flags.go:64] FLAG: --vmodule="" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676912 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.676919 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677013 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677016 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677019 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677022 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:52.682108 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677024 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677027 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677029 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677032 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677035 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677037 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677041 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677045 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677049 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677051 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677054 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677058 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677063 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677066 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677068 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677071 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677073 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677076 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677078 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:52.682676 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677081 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677083 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677086 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677088 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677091 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677093 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677095 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677098 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677102 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677106 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677109 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677112 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677114 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677117 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677120 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677122 2580 
feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677125 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677127 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677130 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677132 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:52.683197 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677134 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677137 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677139 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677142 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677144 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677147 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677150 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677153 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677155 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677158 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677160 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677162 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677165 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677167 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677169 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677172 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677174 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677177 2580 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677179 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677182 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:52.683730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677186 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677190 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677192 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677195 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677197 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677200 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677203 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677205 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677208 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677211 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677213 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677216 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677218 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677220 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677223 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677225 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677228 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677230 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677233 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:52.684256 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677236 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:52.684256 
ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677238 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:52.684739 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677241 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:52.684739 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.677243 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:52.684739 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.677896 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:52.685860 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.685826 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:52:52.685900 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.685860 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685906 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685912 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685915 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685918 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685921 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685924 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685926 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685929 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685931 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:52.685931 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685934 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685937 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685940 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685942 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685945 2580 feature_gate.go:328] unrecognized feature gate: 
MultiArchInstallAzure Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685948 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685950 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685953 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685956 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685958 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685962 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685965 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685967 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685970 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685972 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685975 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685978 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685981 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685983 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685985 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:52.686178 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685988 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685990 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685994 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685996 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.685999 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686001 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686004 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686006 2580 
feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686009 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686011 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686013 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686016 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686018 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686021 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686024 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686027 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686029 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686032 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686034 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686037 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:52.686652 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686039 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686042 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686044 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686047 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686050 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686052 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686055 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686057 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686060 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686062 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:52.687145 
ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686064 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686067 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686069 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686072 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686074 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686077 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686080 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686082 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686086 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:52.687145 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686090 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686093 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686096 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686098 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686101 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686103 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686106 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686110 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686112 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686116 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
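The repeated "unrecognized feature gate" warnings above come from the kubelet's feature-gate parsing: the rendered kubelet configuration appears to carry the full OpenShift cluster feature set, but the kubelet only registers upstream Kubernetes gates, so every cluster-level name (GatewayAPI, ManagedBootImages, SigstoreImageVerification, and so on) is warned about and then ignored, while the recognized gates are what end up in the "feature gates: {map[...]}" summary entries. A minimal, self-contained Go sketch of that lookup behaviour follows; the gate names, map layout, and output format are illustrative only, not the kubelet's actual implementation:

// Illustrative sketch only (not the kubelet's real parser): a gate name that
// is not registered with the component produces an "unrecognized feature
// gate" warning and never reaches the effective gate map.
package main

import "fmt"

func main() {
    // Gates the component knows about (a small subset of the recognized set
    // shown in the "feature gates: {map[...]}" summary lines above).
    known := map[string]bool{
        "NodeSwap":                       false,
        "ImageVolume":                    true,
        "ServiceAccountTokenNodeBinding": true,
    }

    // Desired settings handed to the component, including names it does not
    // register (cluster-level gates such as GatewayAPI or ManagedBootImages).
    desired := map[string]bool{
        "ImageVolume":       true,
        "GatewayAPI":        true, // not a kubelet gate -> warning
        "ManagedBootImages": true, // not a kubelet gate -> warning
    }

    for name, value := range desired {
        if _, ok := known[name]; !ok {
            fmt.Printf("W unrecognized feature gate: %s\n", name)
            continue
        }
        known[name] = value
    }
    fmt.Printf("effective gates: %v\n", known)
}

On this reading the warnings are noise rather than a failure: unknown names are skipped and only the recognized gates influence kubelet behaviour.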
Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686120 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686123 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686125 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686128 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686131 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686133 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686136 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:52.687636 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686139 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.686144 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686243 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686249 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686251 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686254 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686257 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686260 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686263 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686266 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686268 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686271 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:52:52.688080 
ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686274 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686277 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686279 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686282 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:52:52.688080 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686285 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686287 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686290 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686292 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686294 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686297 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686300 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686304 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
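Two of the warnings above are different in kind: feature_gate.go:351 notes that ServiceAccountTokenNodeBinding is already GA, and feature_gate.go:349 notes that KMSv1 is deprecated, so explicitly setting either still takes effect but is flagged as something that will stop mattering in a future release. A hedged sketch of how such gates are registered and overridden, assuming the k8s.io/component-base/featuregate package that upstream components build on (exact API details vary by release, and the kubelet's own wiring differs):

// Hedged sketch using k8s.io/component-base/featuregate: registering a GA and
// a deprecated gate, then applying explicit overrides of the kind the log
// warns about. Not the kubelet's actual registration code.
package main

import (
    "fmt"

    "k8s.io/component-base/featuregate"
)

func main() {
    fg := featuregate.NewFeatureGate()

    // Register two gates that appear in the warnings above.
    _ = fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
        "ServiceAccountTokenNodeBinding": {Default: true, PreRelease: featuregate.GA},
        "KMSv1":                          {Default: false, PreRelease: featuregate.Deprecated},
    })

    // Explicitly setting a GA or deprecated gate succeeds, but is the case
    // the kubelet logs as "will be removed in a future release".
    if err := fg.SetFromMap(map[string]bool{
        "ServiceAccountTokenNodeBinding": true,
        "KMSv1":                          true,
    }); err != nil {
        fmt.Println("override failed:", err)
        return
    }
    fmt.Println("KMSv1 enabled:", fg.Enabled("KMSv1"))
}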
Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686308 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686310 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686313 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686316 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686318 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686321 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686324 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686326 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686329 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686332 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686334 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:52:52.688464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686336 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686339 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686341 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686344 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686346 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686349 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686351 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686353 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686356 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686358 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686361 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:52:52.688947 
ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686363 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686366 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686368 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686371 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686373 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686375 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686378 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686380 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686384 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:52:52.688947 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686386 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686389 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686391 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686394 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686396 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686398 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686401 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686403 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686406 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686408 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686410 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686413 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686415 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686418 
2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686420 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686423 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686425 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686428 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686430 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686432 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:52:52.689422 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686435 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686437 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686440 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686444 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686447 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686449 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686451 2580 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686454 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686457 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686459 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686461 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686464 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:52.686467 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.686472 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.687110 2580 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 14:52:52.689976 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.689436 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 14:52:52.690485 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.690474 2580 server.go:1019] "Starting client certificate rotation" Apr 16 14:52:52.690593 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.690574 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:52.690644 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.690619 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 14:52:52.714983 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.714962 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:52.717098 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.717081 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 14:52:52.729909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.729893 2580 log.go:25] "Validated CRI v1 runtime API" Apr 16 14:52:52.734885 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.734872 2580 log.go:25] "Validated CRI v1 image API" Apr 16 14:52:52.736148 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.736134 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 14:52:52.743980 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.743961 2580 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 803dfe65-27f9-4213-a6f6-f664b972b43c:/dev/nvme0n1p4 c53ea91d-9b5c-46df-9bbe-65027eb4866e:/dev/nvme0n1p3] Apr 16 14:52:52.744049 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.743981 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 14:52:52.749921 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.749906 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:52.750180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.750072 2580 manager.go:217] Machine: {Timestamp:2026-04-16 14:52:52.74814027 +0000 UTC m=+0.401522634 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3155577 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec216123bd02d26c49bf7234b07cd48f SystemUUID:ec216123-bd02-d26c-49bf-7234b07cd48f 
BootID:2a4d7068-59e4-43b7-b59b-81cf86ea588e Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:75:c6:d5:88:25 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:75:c6:d5:88:25 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:76:89:06:d2:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 14:52:52.750180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.750176 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
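Before building the container manager, the kubelet validates the CRI runtime and image APIs, adopts the systemd cgroup driver reported by the runtime, and lets cAdvisor inventory the node: the Machine entry above records 8 logical cores over 4 physical, roughly 32 GiB of memory, the ext4 /boot and xfs /var partitions, and the cache/NUMA topology. A stdlib-only Go sketch (not cAdvisor itself, and Linux-specific) that gathers the same kind of facts on a host:

// Stdlib-only illustration of the node facts summarized in the Machine line
// above: logical CPU count, total memory, and filesystem capacity for /var.
package main

import (
    "bufio"
    "fmt"
    "os"
    "runtime"
    "strings"
    "syscall"
)

func main() {
    fmt.Println("NumCores:", runtime.NumCPU())

    // MemTotal from /proc/meminfo (Linux), roughly MemoryCapacity above.
    if f, err := os.Open("/proc/meminfo"); err == nil {
        defer f.Close()
        s := bufio.NewScanner(f)
        for s.Scan() {
            if strings.HasPrefix(s.Text(), "MemTotal:") {
                fmt.Println(s.Text())
                break
            }
        }
    }

    // Capacity of the filesystem backing /var, as in the Filesystems list.
    var st syscall.Statfs_t
    if err := syscall.Statfs("/var", &st); err == nil {
        fmt.Println("/var capacity bytes:", st.Blocks*uint64(st.Bsize))
    }
}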
Apr 16 14:52:52.750274 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.750250 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 14:52:52.752483 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.752455 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 14:52:52.752621 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.752486 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-160.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 14:52:52.752668 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.752630 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 14:52:52.752668 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.752639 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 14:52:52.752668 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.752652 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:52.753500 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.753488 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:52:52.754709 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.754700 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:52.754820 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.754811 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:52:52.756964 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.756954 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:52:52.756998 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.756972 2580 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 16 14:52:52.756998 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.756984 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 14:52:52.756998 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.756993 2580 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:52:52.757124 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.757000 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 14:52:52.757937 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.757922 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:52.757937 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.757940 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:52:52.762304 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.762282 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:52:52.764072 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.764055 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:52:52.765170 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765154 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:52:52.765246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765176 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:52:52.765246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765194 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:52:52.765246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765203 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:52:52.765246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765212 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 14:52:52.765246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765228 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:52:52.765246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765237 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:52:52.765246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765247 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:52:52.765474 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765259 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:52:52.765474 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765269 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:52:52.765474 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765283 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:52:52.765474 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.765297 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:52:52.767115 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.767088 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:52:52.767166 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.767121 2580 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 16 14:52:52.770062 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.770032 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-160.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 14:52:52.770148 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.770126 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:52:52.770267 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.770251 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:52:52.770796 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.770783 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:52:52.770863 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.770819 2580 server.go:1295] "Started kubelet" Apr 16 14:52:52.770935 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.770916 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:52:52.771024 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.770980 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:52:52.771058 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.771048 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:52:52.771613 ip-10-0-137-160 systemd[1]: Started Kubernetes Kubelet. 
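At this point systemd reports the unit as started: the kubelet is listening on 0.0.0.0:10250 and rate-limiting the podresources service, whose unix endpoint is logged just below. The "system:anonymous ... forbidden" errors in the entries just above are what one would expect while the node is still bootstrapping its client certificate; csr-5j5fv is approved and issued a few entries later, after which the informer caches populate. A minimal stdlib Go check that the pod-resources socket is accepting connections; the path is taken from the endpoint logged below, the socket speaks gRPC, and a real consumer would use the PodResourcesLister client rather than a bare dial:

// Minimal stdlib connectivity check (not a real client) for the kubelet
// pod-resources socket. Typically requires root to access the socket.
package main

import (
    "fmt"
    "net"
    "time"
)

func main() {
    const sock = "/var/lib/kubelet/pod-resources/kubelet.sock"
    conn, err := net.DialTimeout("unix", sock, 2*time.Second)
    if err != nil {
        fmt.Println("pod-resources socket not reachable:", err)
        return
    }
    defer conn.Close()
    fmt.Println("pod-resources socket is accepting connections at", sock)
}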
Apr 16 14:52:52.772262 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.772215 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:52:52.772749 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.772738 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:52:52.776993 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.776969 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5j5fv" Apr 16 14:52:52.777937 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.777918 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:52:52.778023 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.777942 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:52.778593 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.778576 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:52:52.778672 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.778579 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:52:52.778672 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.778621 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:52:52.778771 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.778706 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:52:52.778771 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.778717 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:52:52.779046 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.778994 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:52.780117 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.779047 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-160.ec2.internal.18a6ddfae95a1438 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-160.ec2.internal,UID:ip-10-0-137-160.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-160.ec2.internal,},FirstTimestamp:2026-04-16 14:52:52.7707966 +0000 UTC m=+0.424178965,LastTimestamp:2026-04-16 14:52:52.7707966 +0000 UTC m=+0.424178965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-160.ec2.internal,}" Apr 16 14:52:52.780741 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.780724 2580 factory.go:55] Registering systemd factory Apr 16 14:52:52.780940 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.780922 2580 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:52:52.781290 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.781271 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:52:52.781368 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.781333 2580 factory.go:153] Registering CRI-O factory Apr 16 14:52:52.781368 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.781342 2580 factory.go:223] Registration of the crio container factory successfully Apr 16 14:52:52.781449 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.781386 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:52:52.781449 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.781404 2580 factory.go:103] Registering Raw factory Apr 16 14:52:52.781449 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.781413 2580 manager.go:1196] Started watching for new ooms in manager Apr 16 14:52:52.781723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.781706 2580 manager.go:319] Starting recovery of all containers Apr 16 14:52:52.786174 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.786021 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5j5fv" Apr 16 14:52:52.786396 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.786371 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 14:52:52.786638 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.786516 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 14:52:52.792144 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.792124 2580 manager.go:324] Recovery completed Apr 16 14:52:52.795178 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.795144 2580 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 14:52:52.798246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.798233 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:52.800448 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.800430 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:52.800519 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.800460 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:52.800519 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.800470 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:52.800983 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.800967 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 
16 14:52:52.800983 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.800982 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:52:52.801073 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.800997 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:52:52.802582 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.802524 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-160.ec2.internal.18a6ddfaeb1e82fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-160.ec2.internal,UID:ip-10-0-137-160.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-160.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-160.ec2.internal,},FirstTimestamp:2026-04-16 14:52:52.800447228 +0000 UTC m=+0.453829592,LastTimestamp:2026-04-16 14:52:52.800447228 +0000 UTC m=+0.453829592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-160.ec2.internal,}" Apr 16 14:52:52.803096 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.803085 2580 policy_none.go:49] "None policy: Start" Apr 16 14:52:52.803134 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.803101 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:52:52.803134 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.803111 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:52:52.841480 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.841466 2580 manager.go:341] "Starting Device Plugin manager" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.841503 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.841519 2580 server.go:85] "Starting device plugin registration server" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.841729 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.841740 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.841811 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.841903 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.841914 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.842440 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 14:52:52.859022 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.842484 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:52.929000 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.928941 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:52:52.930257 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.930243 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:52:52.930350 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.930265 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:52:52.930350 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.930282 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 14:52:52.930350 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.930288 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:52:52.930350 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.930317 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:52:52.934663 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.934648 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:52.942851 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.942816 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:52.943781 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.943768 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:52.943820 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.943795 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:52.943820 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.943805 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:52.943901 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.943826 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-160.ec2.internal" Apr 16 14:52:52.954909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:52.954893 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-160.ec2.internal" Apr 16 14:52:52.954963 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.954913 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-160.ec2.internal\": node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:52.992235 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:52.992216 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.031307 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.031286 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal"] Apr 16 14:52:53.031378 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.031357 2580 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:53.032098 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.032077 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:53.032177 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.032110 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:53.032177 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.032121 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:53.033366 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.033355 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:53.033494 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.033483 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.033528 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.033509 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:53.034063 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.034047 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:53.034143 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.034073 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:53.034143 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.034089 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:53.034143 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.034055 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:53.034248 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.034144 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:53.034248 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.034155 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:53.035610 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.035596 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.035652 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.035624 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:52:53.036258 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.036243 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:52:53.036333 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.036274 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:52:53.036333 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.036287 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:52:53.065404 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.065385 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-160.ec2.internal\" not found" node="ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.069795 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.069781 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-160.ec2.internal\" not found" node="ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.079982 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.079930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28881687908b35bd40a7465b6e8c523f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal\" (UID: \"28881687908b35bd40a7465b6e8c523f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.080059 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.079994 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28881687908b35bd40a7465b6e8c523f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal\" (UID: \"28881687908b35bd40a7465b6e8c523f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.080059 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.080012 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0cc6f13ece139d4b95b6457b753b2eb6-config\") pod \"kube-apiserver-proxy-ip-10-0-137-160.ec2.internal\" (UID: \"0cc6f13ece139d4b95b6457b753b2eb6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.093133 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.093115 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.180942 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.180898 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28881687908b35bd40a7465b6e8c523f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal\" (UID: \"28881687908b35bd40a7465b6e8c523f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 
16 14:52:53.180942 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.180923 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28881687908b35bd40a7465b6e8c523f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal\" (UID: \"28881687908b35bd40a7465b6e8c523f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.180942 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.180941 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0cc6f13ece139d4b95b6457b753b2eb6-config\") pod \"kube-apiserver-proxy-ip-10-0-137-160.ec2.internal\" (UID: \"0cc6f13ece139d4b95b6457b753b2eb6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.181051 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.180980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0cc6f13ece139d4b95b6457b753b2eb6-config\") pod \"kube-apiserver-proxy-ip-10-0-137-160.ec2.internal\" (UID: \"0cc6f13ece139d4b95b6457b753b2eb6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.181051 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.180996 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/28881687908b35bd40a7465b6e8c523f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal\" (UID: \"28881687908b35bd40a7465b6e8c523f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.181051 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.180996 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/28881687908b35bd40a7465b6e8c523f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal\" (UID: \"28881687908b35bd40a7465b6e8c523f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.194017 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.193998 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.294891 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.294855 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.368069 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.368043 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.372438 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.372423 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" Apr 16 14:52:53.395319 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.395292 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.495975 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.495881 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.596378 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.596344 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.669668 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.669645 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:53.690332 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.690314 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:52:53.690432 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.690418 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:53.690474 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.690462 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:52:53.697462 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.697444 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.779122 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.779063 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:52:53.787771 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.787754 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:52:53.787771 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.787759 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:47:52 +0000 UTC" deadline="2027-12-04 06:07:17.023198333 +0000 UTC" Apr 16 14:52:53.787914 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.787776 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14319h14m23.235424953s" Apr 16 14:52:53.797880 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.797864 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.816075 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.816057 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-v6dvt" Apr 16 14:52:53.827667 ip-10-0-137-160 kubenswrapper[2580]: I0416 
14:52:53.827646 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-v6dvt" Apr 16 14:52:53.898129 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.898107 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:53.906249 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:53.906217 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28881687908b35bd40a7465b6e8c523f.slice/crio-aa137bccbf12aee64733fe205f3f89cf2f11d20d7fff1e24784dfdbe2060adf9 WatchSource:0}: Error finding container aa137bccbf12aee64733fe205f3f89cf2f11d20d7fff1e24784dfdbe2060adf9: Status 404 returned error can't find the container with id aa137bccbf12aee64733fe205f3f89cf2f11d20d7fff1e24784dfdbe2060adf9 Apr 16 14:52:53.910099 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:53.910077 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cc6f13ece139d4b95b6457b753b2eb6.slice/crio-35f57cdfba7d3109adb7bfa99cd235d91dc87cc023c5da3d63a0dc124be50922 WatchSource:0}: Error finding container 35f57cdfba7d3109adb7bfa99cd235d91dc87cc023c5da3d63a0dc124be50922: Status 404 returned error can't find the container with id 35f57cdfba7d3109adb7bfa99cd235d91dc87cc023c5da3d63a0dc124be50922 Apr 16 14:52:53.911037 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.911020 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:53.932869 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.932805 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" event={"ID":"0cc6f13ece139d4b95b6457b753b2eb6","Type":"ContainerStarted","Data":"35f57cdfba7d3109adb7bfa99cd235d91dc87cc023c5da3d63a0dc124be50922"} Apr 16 14:52:53.933619 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:53.933600 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" event={"ID":"28881687908b35bd40a7465b6e8c523f","Type":"ContainerStarted","Data":"aa137bccbf12aee64733fe205f3f89cf2f11d20d7fff1e24784dfdbe2060adf9"} Apr 16 14:52:53.999056 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:53.999030 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-160.ec2.internal\" not found" Apr 16 14:52:54.082661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.082619 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:54.179257 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.179229 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" Apr 16 14:52:54.189178 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.189163 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:54.190064 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.190051 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" Apr 16 14:52:54.199183 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.199168 
2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:52:54.304659 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.304633 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:54.599614 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.599585 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:54.758151 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.758117 2580 apiserver.go:52] "Watching apiserver" Apr 16 14:52:54.764160 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.764136 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:52:54.764550 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.764522 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-wvkzh","openshift-network-operator/iptables-alerter-5dfrs","openshift-ovn-kubernetes/ovnkube-node-czzzx","kube-system/konnectivity-agent-9dmjs","openshift-image-registry/node-ca-qpspz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal","openshift-multus/multus-68rx2","openshift-multus/multus-additional-cni-plugins-w4dhd","openshift-multus/network-metrics-daemon-kgs47","kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst","openshift-cluster-node-tuning-operator/tuned-bs8ms","openshift-dns/node-resolver-qswpg"] Apr 16 14:52:54.767576 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.767557 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.770127 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.770097 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.770127 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.770114 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.770255 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.770193 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:52:54.770331 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.770319 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qk77r\"" Apr 16 14:52:54.771723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.771700 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.773646 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.773631 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-h65jw\"" Apr 16 14:52:54.773736 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.773704 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.773800 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.773745 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:52:54.773912 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.773893 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.774110 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.774093 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.776310 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.776028 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.776310 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.776037 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nqc6k\"" Apr 16 14:52:54.776310 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.776102 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.776717 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.776700 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.778730 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.778710 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x74kr\"" Apr 16 14:52:54.778825 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.778752 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.778825 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.778720 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.779183 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.779163 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.779290 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.779271 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:54.779402 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.779320 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:52:54.781124 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.781056 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.781124 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.781084 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-lcdvs\"" Apr 16 14:52:54.781259 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.781065 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.781468 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.781447 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.782015 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.781995 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:52:54.782086 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.782024 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:52:54.783541 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.783516 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:52:54.788758 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.787364 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:52:54.788758 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.787424 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:54.788758 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.787510 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:52:54.788758 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.787626 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wdc67\"" Apr 16 14:52:54.789908 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.789889 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.789990 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.789893 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-hostroot\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.789990 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.789969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-multus-certs\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.789987 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:52:54.790091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.789997 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-etc-kubernetes\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790023 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:54.790091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790048 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48dde1ba-8911-4c19-9083-79bd3339f3bf-tmp-dir\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.790091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790082 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-lib-modules\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790113 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-conf-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbkhl\" (UniqueName: \"kubernetes.io/projected/70c41107-b96d-429c-a82c-270215f0994f-kube-api-access-tbkhl\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " 
pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790174 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysctl-conf\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-run\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790217 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-cni-multus\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790240 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-kubelet\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzdmt\" (UniqueName: \"kubernetes.io/projected/475f512a-706c-424b-b38f-428bf1b64f69-kube-api-access-nzdmt\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790304 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-registration-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790325 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-device-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790363 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysconfig\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790401 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-sys\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790427 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/475f512a-706c-424b-b38f-428bf1b64f69-multus-daemon-config\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790454 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-etc-selinux\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790514 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70c41107-b96d-429c-a82c-270215f0994f-host-slash\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-tuned\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790597 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-cnibin\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-socket-dir-parent\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790645 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-cni-bin\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790669 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-socket-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" 
(UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.790688 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790691 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-sys-fs\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790714 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70c41107-b96d-429c-a82c-270215f0994f-iptables-alerter-script\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790739 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vhk\" (UniqueName: \"kubernetes.io/projected/48dde1ba-8911-4c19-9083-79bd3339f3bf-kube-api-access-m4vhk\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790765 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b533746b-70b1-42cc-ab44-8b3907cf75a3-tmp\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790788 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/475f512a-706c-424b-b38f-428bf1b64f69-cni-binary-copy\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790810 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-k8s-cni-cncf-io\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-netns\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790940 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b8wv\" (UniqueName: \"kubernetes.io/projected/7347b36a-63d3-4952-9fcc-7bc501135de9-kube-api-access-4b8wv\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.791316 
ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-modprobe-d\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.790987 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-kubernetes\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791021 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-var-lib-kubelet\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791061 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48dde1ba-8911-4c19-9083-79bd3339f3bf-hosts-file\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791106 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysctl-d\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791130 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-systemd\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791153 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-host\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791179 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lncbr\" (UniqueName: \"kubernetes.io/projected/b533746b-70b1-42cc-ab44-8b3907cf75a3-kube-api-access-lncbr\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791204 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-system-cni-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.791316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791224 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-cni-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.791832 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791245 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-os-release\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.791832 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.791271 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.792129 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792105 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.792216 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792125 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:52:54.792279 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792222 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5ds8c\"" Apr 16 14:52:54.792327 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792282 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:52:54.792392 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792373 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:52:54.792468 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792447 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.792692 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792673 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.792769 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.792709 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-t8kzr\"" Apr 16 14:52:54.794615 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.794455 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:52:54.794615 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.794508 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:52:54.794615 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.794517 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:52:54.794915 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.794899 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:52:54.795077 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.795060 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nvq22\"" Apr 16 14:52:54.795212 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.795201 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:52:54.795434 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.795416 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:52:54.828378 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.828353 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:53 +0000 UTC" deadline="2027-12-01 17:13:14.723293776 +0000 UTC" Apr 16 14:52:54.828378 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.828377 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14258h20m19.894919204s" Apr 16 14:52:54.879362 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.879344 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:52:54.891607 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-registration-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.891757 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891730 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-device-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.891833 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891766 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-sys\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.891833 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891690 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-registration-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.891833 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891795 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-node-log\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.891833 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891821 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovn-node-metrics-cert\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-device-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891887 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70c41107-b96d-429c-a82c-270215f0994f-host-slash\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891882 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-sys\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70c41107-b96d-429c-a82c-270215f0994f-host-slash\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-tuned\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891959 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-cnibin\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.891984 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-socket-dir-parent\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-cni-bin\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.892056 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-cni-bin\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892064 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-socket-dir-parent\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892069 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6k4\" (UniqueName: \"kubernetes.io/projected/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-kube-api-access-zs6k4\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892078 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-cnibin\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892103 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-sys-fs\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.892404 ip-10-0-137-160 
kubenswrapper[2580]: I0416 14:52:54.892144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70c41107-b96d-429c-a82c-270215f0994f-iptables-alerter-script\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892159 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b533746b-70b1-42cc-ab44-8b3907cf75a3-tmp\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892182 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/475f512a-706c-424b-b38f-428bf1b64f69-cni-binary-copy\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892195 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-sys-fs\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-host\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892216 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892234 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b8wv\" (UniqueName: \"kubernetes.io/projected/7347b36a-63d3-4952-9fcc-7bc501135de9-kube-api-access-4b8wv\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892276 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-kubernetes\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892311 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-var-lib-kubelet\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892346 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-run-netns\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892357 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-kubernetes\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.892404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892371 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892410 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-run-ovn-kubernetes\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892435 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-var-lib-kubelet\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892458 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tch8q\" (UniqueName: \"kubernetes.io/projected/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-kube-api-access-tch8q\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892503 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysctl-d\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892526 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-host\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892549 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-system-cni-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892572 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-os-release\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892596 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-cni-netd\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892607 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-host\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892617 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cnibin\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892624 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysctl-d\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893092 ip-10-0-137-160 
kubenswrapper[2580]: I0416 14:52:54.892651 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc8dh\" (UniqueName: \"kubernetes.io/projected/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-kube-api-access-wc8dh\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892683 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-etc-kubernetes\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892695 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-system-cni-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-slash\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892696 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-os-release\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892712 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70c41107-b96d-429c-a82c-270215f0994f-iptables-alerter-script\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892745 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-env-overrides\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892748 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-etc-kubernetes\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892771 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: 
\"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892774 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/475f512a-706c-424b-b38f-428bf1b64f69-cni-binary-copy\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892797 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-lib-modules\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892863 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9613bf50-dbc0-4e6d-aeba-8f63da3babdb-agent-certs\") pod \"konnectivity-agent-9dmjs\" (UID: \"9613bf50-dbc0-4e6d-aeba-8f63da3babdb\") " pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892883 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-systemd\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892914 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-lib-modules\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892921 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbkhl\" (UniqueName: \"kubernetes.io/projected/70c41107-b96d-429c-a82c-270215f0994f-kube-api-access-tbkhl\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.892947 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-run\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893107 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-run\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-kubelet\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893212 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-kubelet\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893264 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzdmt\" (UniqueName: \"kubernetes.io/projected/475f512a-706c-424b-b38f-428bf1b64f69-kube-api-access-nzdmt\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893288 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-serviceca\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.893661 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893309 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-system-cni-dir\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysconfig\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893372 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysconfig\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893382 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/475f512a-706c-424b-b38f-428bf1b64f69-multus-daemon-config\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893406 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-kubelet\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893425 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-etc-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893445 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-log-socket\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893468 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovnkube-config\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893495 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-etc-selinux\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-socket-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893583 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-etc-selinux\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893622 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vhk\" (UniqueName: \"kubernetes.io/projected/48dde1ba-8911-4c19-9083-79bd3339f3bf-kube-api-access-m4vhk\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893639 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-socket-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: 
\"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893651 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-k8s-cni-cncf-io\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893676 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-netns\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893702 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-cni-bin\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893704 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-k8s-cni-cncf-io\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894293 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893730 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-modprobe-d\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893735 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-netns\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893759 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9613bf50-dbc0-4e6d-aeba-8f63da3babdb-konnectivity-ca\") pod \"konnectivity-agent-9dmjs\" (UID: \"9613bf50-dbc0-4e6d-aeba-8f63da3babdb\") " pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893813 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/475f512a-706c-424b-b38f-428bf1b64f69-multus-daemon-config\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893824 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-systemd-units\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-modprobe-d\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893870 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48dde1ba-8911-4c19-9083-79bd3339f3bf-hosts-file\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893903 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-systemd\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lncbr\" (UniqueName: \"kubernetes.io/projected/b533746b-70b1-42cc-ab44-8b3907cf75a3-kube-api-access-lncbr\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893949 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-cni-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893961 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-systemd\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893965 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48dde1ba-8911-4c19-9083-79bd3339f3bf-hosts-file\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.893973 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-multus-certs\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894002 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-run-multus-certs\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894001 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-os-release\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894021 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-cni-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894037 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.894802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894056 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894071 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-hostroot\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894086 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-ovn\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894102 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894106 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7347b36a-63d3-4952-9fcc-7bc501135de9-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894140 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-hostroot\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894148 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48dde1ba-8911-4c19-9083-79bd3339f3bf-tmp-dir\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894162 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-conf-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894178 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovnkube-script-lib\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894220 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894328 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-multus-conf-dir\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894343 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysctl-conf\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894374 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-cni-multus\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-var-lib-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m7t8\" (UniqueName: \"kubernetes.io/projected/4c7680da-cb3c-4ad2-b143-8ff457f88efe-kube-api-access-6m7t8\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894439 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/475f512a-706c-424b-b38f-428bf1b64f69-host-var-lib-cni-multus\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.895464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894481 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48dde1ba-8911-4c19-9083-79bd3339f3bf-tmp-dir\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.896279 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.894492 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-sysctl-conf\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.896279 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.895418 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b533746b-70b1-42cc-ab44-8b3907cf75a3-tmp\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.896279 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.895465 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b533746b-70b1-42cc-ab44-8b3907cf75a3-etc-tuned\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.900636 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.900574 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:54.900636 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.900602 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:54.900636 
ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.900615 2580 projected.go:194] Error preparing data for projected volume kube-api-access-thqdk for pod openshift-network-diagnostics/network-check-target-wvkzh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:54.900636 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.900697 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk podName:f33dcc16-237a-4c31-aca0-f46c9648fc20 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:55.400666029 +0000 UTC m=+3.054048402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-thqdk" (UniqueName: "kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk") pod "network-check-target-wvkzh" (UID: "f33dcc16-237a-4c31-aca0-f46c9648fc20") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:54.902518 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.902494 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzdmt\" (UniqueName: \"kubernetes.io/projected/475f512a-706c-424b-b38f-428bf1b64f69-kube-api-access-nzdmt\") pod \"multus-68rx2\" (UID: \"475f512a-706c-424b-b38f-428bf1b64f69\") " pod="openshift-multus/multus-68rx2" Apr 16 14:52:54.902621 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.902502 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vhk\" (UniqueName: \"kubernetes.io/projected/48dde1ba-8911-4c19-9083-79bd3339f3bf-kube-api-access-m4vhk\") pod \"node-resolver-qswpg\" (UID: \"48dde1ba-8911-4c19-9083-79bd3339f3bf\") " pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:54.902684 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.902643 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbkhl\" (UniqueName: \"kubernetes.io/projected/70c41107-b96d-429c-a82c-270215f0994f-kube-api-access-tbkhl\") pod \"iptables-alerter-5dfrs\" (UID: \"70c41107-b96d-429c-a82c-270215f0994f\") " pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:54.903244 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.903218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lncbr\" (UniqueName: \"kubernetes.io/projected/b533746b-70b1-42cc-ab44-8b3907cf75a3-kube-api-access-lncbr\") pod \"tuned-bs8ms\" (UID: \"b533746b-70b1-42cc-ab44-8b3907cf75a3\") " pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:54.904121 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.904101 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b8wv\" (UniqueName: \"kubernetes.io/projected/7347b36a-63d3-4952-9fcc-7bc501135de9-kube-api-access-4b8wv\") pod \"aws-ebs-csi-driver-node-7zbst\" (UID: \"7347b36a-63d3-4952-9fcc-7bc501135de9\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:54.995663 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-var-lib-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: 
\"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.995806 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m7t8\" (UniqueName: \"kubernetes.io/projected/4c7680da-cb3c-4ad2-b143-8ff457f88efe-kube-api-access-6m7t8\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.995806 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-node-log\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.995806 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995711 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-var-lib-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.995806 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovn-node-metrics-cert\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.995806 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs6k4\" (UniqueName: \"kubernetes.io/projected/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-kube-api-access-zs6k4\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.995806 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995793 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-node-log\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995804 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995862 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-host\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995891 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-run-netns\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-run-ovn-kubernetes\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995969 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tch8q\" (UniqueName: \"kubernetes.io/projected/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-kube-api-access-tch8q\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.995998 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-cni-netd\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cnibin\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996038 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc8dh\" (UniqueName: \"kubernetes.io/projected/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-kube-api-access-wc8dh\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996106 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-slash\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 
14:52:54.996108 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-run-netns\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996113 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996109 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-run-ovn-kubernetes\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996121 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-host\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996133 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-env-overrides\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996176 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-cni-netd\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996159 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996214 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9613bf50-dbc0-4e6d-aeba-8f63da3babdb-agent-certs\") pod \"konnectivity-agent-9dmjs\" (UID: \"9613bf50-dbc0-4e6d-aeba-8f63da3babdb\") " pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996221 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cnibin\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996242 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-systemd\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 
14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996298 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-systemd\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-slash\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996390 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cni-binary-copy\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996435 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996500 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-serviceca\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996525 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-system-cni-dir\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996552 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-kubelet\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996579 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-etc-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996602 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-log-socket\") pod \"ovnkube-node-czzzx\" (UID: 
\"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.996698 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovnkube-config\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996664 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-cni-bin\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996693 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9613bf50-dbc0-4e6d-aeba-8f63da3babdb-konnectivity-ca\") pod \"konnectivity-agent-9dmjs\" (UID: \"9613bf50-dbc0-4e6d-aeba-8f63da3babdb\") " pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996737 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996748 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-systemd-units\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996791 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-systemd-units\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996795 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-os-release\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996829 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-os-release\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996880 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-ovn\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996903 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-env-overrides\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-log-socket\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-run-ovn\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-kubelet\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997013 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovnkube-script-lib\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.996980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-serviceca\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.997110 2580 secret.go:189] 
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:54.997511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997030 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-etc-openvswitch\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997203 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997035 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-system-cni-dir\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:54.997170 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:55.497154363 +0000 UTC m=+3.150536736 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997403 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-host-cni-bin\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997464 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9613bf50-dbc0-4e6d-aeba-8f63da3babdb-konnectivity-ca\") pod \"konnectivity-agent-9dmjs\" (UID: \"9613bf50-dbc0-4e6d-aeba-8f63da3babdb\") " pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997487 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovnkube-config\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997503 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4c7680da-cb3c-4ad2-b143-8ff457f88efe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c7680da-cb3c-4ad2-b143-8ff457f88efe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:54.998180 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.997874 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovnkube-script-lib\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.998523 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.998412 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-ovn-node-metrics-cert\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:54.998898 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:54.998880 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9613bf50-dbc0-4e6d-aeba-8f63da3babdb-agent-certs\") pod \"konnectivity-agent-9dmjs\" (UID: \"9613bf50-dbc0-4e6d-aeba-8f63da3babdb\") " pod="kube-system/konnectivity-agent-9dmjs" Apr 16 
14:52:55.008749 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.008713 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc8dh\" (UniqueName: \"kubernetes.io/projected/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-kube-api-access-wc8dh\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:55.008878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.008861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tch8q\" (UniqueName: \"kubernetes.io/projected/8e0c1234-5484-4c8c-9e5f-c0d64478ef21-kube-api-access-tch8q\") pod \"ovnkube-node-czzzx\" (UID: \"8e0c1234-5484-4c8c-9e5f-c0d64478ef21\") " pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:55.009737 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.009709 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m7t8\" (UniqueName: \"kubernetes.io/projected/4c7680da-cb3c-4ad2-b143-8ff457f88efe-kube-api-access-6m7t8\") pod \"multus-additional-cni-plugins-w4dhd\" (UID: \"4c7680da-cb3c-4ad2-b143-8ff457f88efe\") " pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:55.010130 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.010113 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6k4\" (UniqueName: \"kubernetes.io/projected/9ab7c27b-be98-41cf-bbea-2ed5ab71d83f-kube-api-access-zs6k4\") pod \"node-ca-qpspz\" (UID: \"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f\") " pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:55.078690 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.078659 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" Apr 16 14:52:55.087563 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.087535 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5dfrs" Apr 16 14:52:55.098185 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.098164 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qswpg" Apr 16 14:52:55.103828 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.103806 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" Apr 16 14:52:55.114489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.114470 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-68rx2" Apr 16 14:52:55.120937 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.120919 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" Apr 16 14:52:55.126560 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.126541 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:52:55.133097 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.133054 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qpspz" Apr 16 14:52:55.138584 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.138570 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:52:55.400773 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.400691 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:55.400928 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:55.400884 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:55.400928 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:55.400906 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:55.400928 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:55.400915 2580 projected.go:194] Error preparing data for projected volume kube-api-access-thqdk for pod openshift-network-diagnostics/network-check-target-wvkzh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:55.401092 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:55.400971 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk podName:f33dcc16-237a-4c31-aca0-f46c9648fc20 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.400954323 +0000 UTC m=+4.054336702 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqdk" (UniqueName: "kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk") pod "network-check-target-wvkzh" (UID: "f33dcc16-237a-4c31-aca0-f46c9648fc20") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:55.501068 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.501047 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:55.501175 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:55.501164 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:55.501219 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:55.501214 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:56.501201568 +0000 UTC m=+4.154583919 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:55.514901 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:55.514819 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0c1234_5484_4c8c_9e5f_c0d64478ef21.slice/crio-d0ae8e8416dba8535e8e7edf4e2f93bec61a13ae8710c783ac7d468875bc3b43 WatchSource:0}: Error finding container d0ae8e8416dba8535e8e7edf4e2f93bec61a13ae8710c783ac7d468875bc3b43: Status 404 returned error can't find the container with id d0ae8e8416dba8535e8e7edf4e2f93bec61a13ae8710c783ac7d468875bc3b43 Apr 16 14:52:55.516001 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:55.515937 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7680da_cb3c_4ad2_b143_8ff457f88efe.slice/crio-bdb02f74c8dd139509320a63ecc1b196b299e644c1a86e41c4558d3d76d9f36b WatchSource:0}: Error finding container bdb02f74c8dd139509320a63ecc1b196b299e644c1a86e41c4558d3d76d9f36b: Status 404 returned error can't find the container with id bdb02f74c8dd139509320a63ecc1b196b299e644c1a86e41c4558d3d76d9f36b Apr 16 14:52:55.519547 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:55.519479 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb533746b_70b1_42cc_ab44_8b3907cf75a3.slice/crio-51f84ae74dfeffc3da06e400412f84e5eff4a38ae6ecc7abbe2bb622a73b7e6c WatchSource:0}: Error finding container 51f84ae74dfeffc3da06e400412f84e5eff4a38ae6ecc7abbe2bb622a73b7e6c: Status 404 returned error can't find the container with id 51f84ae74dfeffc3da06e400412f84e5eff4a38ae6ecc7abbe2bb622a73b7e6c Apr 16 14:52:55.520137 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:55.520118 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod475f512a_706c_424b_b38f_428bf1b64f69.slice/crio-597549946becf53d2cba6c445ff3a2a2aec15014999bbc0c8a159a8a29aba697 WatchSource:0}: Error finding container 597549946becf53d2cba6c445ff3a2a2aec15014999bbc0c8a159a8a29aba697: Status 404 returned error can't find the container with id 597549946becf53d2cba6c445ff3a2a2aec15014999bbc0c8a159a8a29aba697 Apr 16 14:52:55.521464 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:55.521443 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7347b36a_63d3_4952_9fcc_7bc501135de9.slice/crio-58c9fd71fc29fafd2be6371beaff8c672256ee60c7d2e2d75857bf53c9d46c4d WatchSource:0}: Error finding container 58c9fd71fc29fafd2be6371beaff8c672256ee60c7d2e2d75857bf53c9d46c4d: Status 404 returned error can't find the container with id 58c9fd71fc29fafd2be6371beaff8c672256ee60c7d2e2d75857bf53c9d46c4d Apr 16 14:52:55.522975 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:55.522949 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab7c27b_be98_41cf_bbea_2ed5ab71d83f.slice/crio-83106444a5f02e88c0b89d3f69a1b528d5c57e0652836c1a06ffbdae535d5af1 WatchSource:0}: Error finding container 83106444a5f02e88c0b89d3f69a1b528d5c57e0652836c1a06ffbdae535d5af1: Status 404 returned error can't 
find the container with id 83106444a5f02e88c0b89d3f69a1b528d5c57e0652836c1a06ffbdae535d5af1 Apr 16 14:52:55.529410 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:52:55.529386 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9613bf50_dbc0_4e6d_aeba_8f63da3babdb.slice/crio-dc5a682e3240109dce49a2b122d2b2621f354c35dc5fc5bf15ebef2f85a45c48 WatchSource:0}: Error finding container dc5a682e3240109dce49a2b122d2b2621f354c35dc5fc5bf15ebef2f85a45c48: Status 404 returned error can't find the container with id dc5a682e3240109dce49a2b122d2b2621f354c35dc5fc5bf15ebef2f85a45c48 Apr 16 14:52:55.830457 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.830048 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:47:53 +0000 UTC" deadline="2027-11-06 11:11:11.10000806 +0000 UTC" Apr 16 14:52:55.830457 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.830328 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13652h18m15.269685826s" Apr 16 14:52:55.932517 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.932025 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:55.932517 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:55.932144 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:52:55.947867 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.945668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qswpg" event={"ID":"48dde1ba-8911-4c19-9083-79bd3339f3bf","Type":"ContainerStarted","Data":"171e27f4758cc65bb715f7517ceb8106a254b51de70198d6780b3cc8cdf4675c"} Apr 16 14:52:55.947867 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.947715 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" event={"ID":"b533746b-70b1-42cc-ab44-8b3907cf75a3","Type":"ContainerStarted","Data":"51f84ae74dfeffc3da06e400412f84e5eff4a38ae6ecc7abbe2bb622a73b7e6c"} Apr 16 14:52:55.953105 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.952991 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerStarted","Data":"bdb02f74c8dd139509320a63ecc1b196b299e644c1a86e41c4558d3d76d9f36b"} Apr 16 14:52:55.957106 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.956492 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qpspz" event={"ID":"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f","Type":"ContainerStarted","Data":"83106444a5f02e88c0b89d3f69a1b528d5c57e0652836c1a06ffbdae535d5af1"} Apr 16 14:52:55.961345 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.961317 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" 
event={"ID":"7347b36a-63d3-4952-9fcc-7bc501135de9","Type":"ContainerStarted","Data":"58c9fd71fc29fafd2be6371beaff8c672256ee60c7d2e2d75857bf53c9d46c4d"} Apr 16 14:52:55.969422 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.969368 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-68rx2" event={"ID":"475f512a-706c-424b-b38f-428bf1b64f69","Type":"ContainerStarted","Data":"597549946becf53d2cba6c445ff3a2a2aec15014999bbc0c8a159a8a29aba697"} Apr 16 14:52:55.973309 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.973260 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"d0ae8e8416dba8535e8e7edf4e2f93bec61a13ae8710c783ac7d468875bc3b43"} Apr 16 14:52:55.998220 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:55.997504 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" event={"ID":"0cc6f13ece139d4b95b6457b753b2eb6","Type":"ContainerStarted","Data":"be4559092063ae7fb2e0a54164477a736d89b13d4a8993f81b67c320c146ae00"} Apr 16 14:52:56.005091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:56.005047 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9dmjs" event={"ID":"9613bf50-dbc0-4e6d-aeba-8f63da3babdb","Type":"ContainerStarted","Data":"dc5a682e3240109dce49a2b122d2b2621f354c35dc5fc5bf15ebef2f85a45c48"} Apr 16 14:52:56.014209 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:56.014165 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5dfrs" event={"ID":"70c41107-b96d-429c-a82c-270215f0994f","Type":"ContainerStarted","Data":"a2df23475f1c53c6f75e3fc8299d9b50529f2b315fcb2880b65f0631af2009ca"} Apr 16 14:52:56.167980 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:56.167830 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:52:56.408366 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:56.408336 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:56.408516 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:56.408497 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:56.408587 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:56.408525 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:56.408587 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:56.408537 2580 projected.go:194] Error preparing data for projected volume kube-api-access-thqdk for pod openshift-network-diagnostics/network-check-target-wvkzh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:56.408691 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:56.408589 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk podName:f33dcc16-237a-4c31-aca0-f46c9648fc20 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:58.408571635 +0000 UTC m=+6.061953990 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-thqdk" (UniqueName: "kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk") pod "network-check-target-wvkzh" (UID: "f33dcc16-237a-4c31-aca0-f46c9648fc20") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:56.509568 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:56.509494 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:56.509714 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:56.509663 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:56.509771 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:56.509732 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:52:58.509712073 +0000 UTC m=+6.163094428 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:56.933055 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:56.933022 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:56.933502 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:56.933176 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:52:57.037728 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:57.037692 2580 generic.go:358] "Generic (PLEG): container finished" podID="28881687908b35bd40a7465b6e8c523f" containerID="8f750259974c4762af9515008ac717bae7e788fedb2c306a15753b15c0056f30" exitCode=0 Apr 16 14:52:57.038453 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:57.038284 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" event={"ID":"28881687908b35bd40a7465b6e8c523f","Type":"ContainerDied","Data":"8f750259974c4762af9515008ac717bae7e788fedb2c306a15753b15c0056f30"} Apr 16 14:52:57.056803 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:57.054095 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-160.ec2.internal" podStartSLOduration=3.054079196 podStartE2EDuration="3.054079196s" podCreationTimestamp="2026-04-16 14:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:56.012670418 +0000 UTC m=+3.666052806" watchObservedRunningTime="2026-04-16 14:52:57.054079196 +0000 UTC m=+4.707461573" Apr 16 14:52:57.931015 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:57.930976 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:57.931172 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:57.931096 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:52:58.049807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:58.049769 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" event={"ID":"28881687908b35bd40a7465b6e8c523f","Type":"ContainerStarted","Data":"84537f6650b19c2e254c38895675c46cf70979cea6ba7d95e0515a6a919b7790"} Apr 16 14:52:58.067258 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:58.067209 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-160.ec2.internal" podStartSLOduration=4.067192921 podStartE2EDuration="4.067192921s" podCreationTimestamp="2026-04-16 14:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:58.066715529 +0000 UTC m=+5.720097896" watchObservedRunningTime="2026-04-16 14:52:58.067192921 +0000 UTC m=+5.720575297" Apr 16 14:52:58.428149 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:58.428103 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:58.428319 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:58.428279 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:52:58.428319 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:58.428303 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:52:58.428319 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:58.428313 2580 projected.go:194] Error preparing data for projected volume kube-api-access-thqdk for pod openshift-network-diagnostics/network-check-target-wvkzh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:58.428475 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:58.428369 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk podName:f33dcc16-237a-4c31-aca0-f46c9648fc20 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:02.428350894 +0000 UTC m=+10.081733269 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-thqdk" (UniqueName: "kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk") pod "network-check-target-wvkzh" (UID: "f33dcc16-237a-4c31-aca0-f46c9648fc20") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:52:58.529108 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:58.529066 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:58.537121 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:58.537095 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:58.537260 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:58.537182 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:02.537160989 +0000 UTC m=+10.190543361 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:52:58.930762 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:58.930730 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:52:58.931006 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:58.930878 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:52:59.931040 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:52:59.931006 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:52:59.931702 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:52:59.931130 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:00.931441 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:00.930959 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:00.931441 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:00.931092 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:01.930973 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:01.930917 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:01.931207 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:01.931049 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:02.464627 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:02.464027 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:02.464627 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:02.464197 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:53:02.464627 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:02.464216 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:53:02.464627 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:02.464228 2580 projected.go:194] Error preparing data for projected volume kube-api-access-thqdk for pod openshift-network-diagnostics/network-check-target-wvkzh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:53:02.464627 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:02.464281 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk podName:f33dcc16-237a-4c31-aca0-f46c9648fc20 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:10.464263426 +0000 UTC m=+18.117645784 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-thqdk" (UniqueName: "kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk") pod "network-check-target-wvkzh" (UID: "f33dcc16-237a-4c31-aca0-f46c9648fc20") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:53:02.565205 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:02.565170 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:02.565357 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:02.565324 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:02.565425 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:02.565386 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:10.565367262 +0000 UTC m=+18.218749630 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:02.931896 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:02.931865 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:02.932174 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:02.931960 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:03.931070 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:03.931040 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:03.931461 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:03.931172 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:04.930744 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:04.930660 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:04.930897 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:04.930798 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:05.931085 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:05.931052 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:05.931536 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:05.931168 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:06.931081 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:06.931021 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:06.931258 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:06.931193 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:07.930510 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:07.930475 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:07.930673 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:07.930593 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:08.930525 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:08.930489 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:08.930976 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:08.930629 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:09.931242 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:09.931209 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:09.931655 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:09.931310 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:10.522368 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:10.522332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:10.522542 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:10.522520 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:53:10.522611 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:10.522552 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:53:10.522611 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:10.522567 2580 projected.go:194] Error preparing data for projected volume kube-api-access-thqdk for pod openshift-network-diagnostics/network-check-target-wvkzh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:53:10.522702 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:10.522631 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk podName:f33dcc16-237a-4c31-aca0-f46c9648fc20 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:26.522612346 +0000 UTC m=+34.175994712 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-thqdk" (UniqueName: "kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk") pod "network-check-target-wvkzh" (UID: "f33dcc16-237a-4c31-aca0-f46c9648fc20") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:53:10.623292 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:10.623259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:10.623430 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:10.623384 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:10.623470 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:10.623437 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:26.623424057 +0000 UTC m=+34.276806409 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:53:10.930751 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:10.930721 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:10.930951 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:10.930862 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:11.930895 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:11.930865 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:11.931296 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:11.930973 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:12.931732 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:12.931587 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:12.932146 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:12.931802 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:13.074279 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.074113 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qpspz" event={"ID":"9ab7c27b-be98-41cf-bbea-2ed5ab71d83f","Type":"ContainerStarted","Data":"2b1057cf08e99bb64cd0a0f9fb38e68435814cb2d955c2850f6238555dd2a639"} Apr 16 14:53:13.075545 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.075513 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" event={"ID":"7347b36a-63d3-4952-9fcc-7bc501135de9","Type":"ContainerStarted","Data":"a321c1bbff5213b4a4d12db184907f0b17e09eee4976c5fae85ee920711d10c1"} Apr 16 14:53:13.076957 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.076919 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-68rx2" event={"ID":"475f512a-706c-424b-b38f-428bf1b64f69","Type":"ContainerStarted","Data":"9d112aa7f97ae15cdc5f9809742e563068aa3259ead8c9621ebbfc4cdd8d0b88"} Apr 16 14:53:13.078716 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.078697 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 14:53:13.079064 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.079044 2580 generic.go:358] "Generic (PLEG): container finished" podID="8e0c1234-5484-4c8c-9e5f-c0d64478ef21" containerID="45c1703ad8f03acfc77ea08597fb20a35f145b933e94bffd2383baa9bf9698fb" exitCode=1 Apr 16 14:53:13.079132 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.079066 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"1850db3588c056fcd37afa16d8599ec5a223b5d4ed5a5ffd2b3027510f9693d1"} Apr 16 14:53:13.079132 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.079090 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerDied","Data":"45c1703ad8f03acfc77ea08597fb20a35f145b933e94bffd2383baa9bf9698fb"} Apr 16 14:53:13.079132 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.079105 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"a8a3a71d941935ef2fcc4502e19609a1cfae1f15cd5613899a8d7c308938c665"} Apr 16 14:53:13.080760 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.080482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9dmjs" event={"ID":"9613bf50-dbc0-4e6d-aeba-8f63da3babdb","Type":"ContainerStarted","Data":"df2d5a4a3d532e8e77e2447db0f0328d76c335c9b6c1d01a037caaa5465ed48e"} Apr 16 14:53:13.081671 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.081651 2580 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-dns/node-resolver-qswpg" event={"ID":"48dde1ba-8911-4c19-9083-79bd3339f3bf","Type":"ContainerStarted","Data":"e8d1b5752bd1338edd4d855b5276cdeda5da6cafb516a4608b0a72bec05b8dd2"} Apr 16 14:53:13.083124 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.083104 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" event={"ID":"b533746b-70b1-42cc-ab44-8b3907cf75a3","Type":"ContainerStarted","Data":"99a6653c71ee0eafae195e20d198048eb41bfcf6688a64ec7a3ee509f94f20eb"} Apr 16 14:53:13.084160 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.084142 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerStarted","Data":"d76656d63fe2d0d01d5179c53c2b943734dc98cab39adb325bb40192551e8509"} Apr 16 14:53:13.089829 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.089791 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qpspz" podStartSLOduration=3.120377764 podStartE2EDuration="20.089779903s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.527832648 +0000 UTC m=+3.181215015" lastFinishedPulling="2026-04-16 14:53:12.497234795 +0000 UTC m=+20.150617154" observedRunningTime="2026-04-16 14:53:13.089212264 +0000 UTC m=+20.742594650" watchObservedRunningTime="2026-04-16 14:53:13.089779903 +0000 UTC m=+20.743162277" Apr 16 14:53:13.103152 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.103113 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qswpg" podStartSLOduration=4.133869748 podStartE2EDuration="21.103100324s" podCreationTimestamp="2026-04-16 14:52:52 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.528019545 +0000 UTC m=+3.181401899" lastFinishedPulling="2026-04-16 14:53:12.497250108 +0000 UTC m=+20.150632475" observedRunningTime="2026-04-16 14:53:13.102553207 +0000 UTC m=+20.755935593" watchObservedRunningTime="2026-04-16 14:53:13.103100324 +0000 UTC m=+20.756482698" Apr 16 14:53:13.131242 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.129221 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-68rx2" podStartSLOduration=3.097173582 podStartE2EDuration="20.129204048s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.52235657 +0000 UTC m=+3.175738937" lastFinishedPulling="2026-04-16 14:53:12.554387038 +0000 UTC m=+20.207769403" observedRunningTime="2026-04-16 14:53:13.128425225 +0000 UTC m=+20.781807596" watchObservedRunningTime="2026-04-16 14:53:13.129204048 +0000 UTC m=+20.782586423" Apr 16 14:53:13.145001 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.144960 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bs8ms" podStartSLOduration=4.166372438 podStartE2EDuration="21.144945432s" podCreationTimestamp="2026-04-16 14:52:52 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.521352746 +0000 UTC m=+3.174735100" lastFinishedPulling="2026-04-16 14:53:12.49992574 +0000 UTC m=+20.153308094" observedRunningTime="2026-04-16 14:53:13.144802512 +0000 UTC m=+20.798184898" watchObservedRunningTime="2026-04-16 14:53:13.144945432 +0000 UTC m=+20.798327807" Apr 16 14:53:13.184759 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.183971 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9dmjs" podStartSLOduration=11.055755435 podStartE2EDuration="20.183953462s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.532967821 +0000 UTC m=+3.186350183" lastFinishedPulling="2026-04-16 14:53:04.661165851 +0000 UTC m=+12.314548210" observedRunningTime="2026-04-16 14:53:13.183684705 +0000 UTC m=+20.837067080" watchObservedRunningTime="2026-04-16 14:53:13.183953462 +0000 UTC m=+20.837335837" Apr 16 14:53:13.478310 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.478229 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:53:13.478950 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.478927 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:53:13.872524 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.872500 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:53:13.930793 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:13.930774 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:13.930892 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:13.930868 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:14.087649 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.087585 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" event={"ID":"7347b36a-63d3-4952-9fcc-7bc501135de9","Type":"ContainerStarted","Data":"d58812b3f0f20437c5599e91dd0466b84c7bce615ce7c050874235cfbdcf4794"} Apr 16 14:53:14.089858 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.089827 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 14:53:14.090190 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.090168 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"79425b2cda2b754981d92de2dd271cd6c96ecee8750e37d89d299dfe4b34ecb7"} Apr 16 14:53:14.090275 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.090199 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"0d96bea91caa1821375d91f7c22f070a9f077a6baef98e9c0f95922eb42df289"} Apr 16 14:53:14.090275 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.090214 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"3daf032f6ca77d7429c7a730f9aaebb3db4dd6093c900e5340fe402c72f88770"} Apr 16 14:53:14.091357 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.091337 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5dfrs" event={"ID":"70c41107-b96d-429c-a82c-270215f0994f","Type":"ContainerStarted","Data":"28e83d0e0620055cff167f69665b422094659bdbee077d47378e7b0f6e18cf34"} Apr 16 14:53:14.092628 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.092607 2580 generic.go:358] "Generic (PLEG): container finished" podID="4c7680da-cb3c-4ad2-b143-8ff457f88efe" containerID="d76656d63fe2d0d01d5179c53c2b943734dc98cab39adb325bb40192551e8509" exitCode=0 Apr 16 14:53:14.092753 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.092730 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerDied","Data":"d76656d63fe2d0d01d5179c53c2b943734dc98cab39adb325bb40192551e8509"} Apr 16 14:53:14.104430 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.104382 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5dfrs" podStartSLOduration=5.141717483 podStartE2EDuration="22.104371222s" podCreationTimestamp="2026-04-16 14:52:52 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.530991282 +0000 UTC m=+3.184373634" lastFinishedPulling="2026-04-16 14:53:12.493645005 +0000 UTC m=+20.147027373" observedRunningTime="2026-04-16 14:53:14.103750289 +0000 UTC m=+21.757132674" watchObservedRunningTime="2026-04-16 14:53:14.104371222 +0000 UTC m=+21.757753596" Apr 16 14:53:14.858081 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.857967 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:53:13.872517082Z","UUID":"906c6604-964d-4234-b728-95c16018263e","Handler":null,"Name":"","Endpoint":""} Apr 16 14:53:14.860608 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.860586 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 14:53:14.860608 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.860611 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:53:14.931181 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:14.931149 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:14.931323 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:14.931289 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:15.094319 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:15.094300 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:53:15.931563 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:15.931265 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:15.931710 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:15.931677 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:16.099651 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:16.099620 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 14:53:16.100091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:16.100059 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"7103188f37c77353108b29d90bbb10bbb40c2d6bbe1dbc3c956912ba57e7dae9"} Apr 16 14:53:16.101773 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:16.101743 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" event={"ID":"7347b36a-63d3-4952-9fcc-7bc501135de9","Type":"ContainerStarted","Data":"fca36173dfa18ec790ec9841d9dc20534e65431408b43e9e75cf02089063b8e4"} Apr 16 14:53:16.118284 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:16.118238 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7zbst" podStartSLOduration=4.604906504 podStartE2EDuration="24.118224784s" podCreationTimestamp="2026-04-16 14:52:52 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.525197263 +0000 UTC m=+3.178579618" lastFinishedPulling="2026-04-16 14:53:15.038515531 +0000 UTC m=+22.691897898" observedRunningTime="2026-04-16 14:53:16.118123322 +0000 UTC m=+23.771505696" watchObservedRunningTime="2026-04-16 14:53:16.118224784 +0000 UTC m=+23.771607160" Apr 16 14:53:16.931216 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:16.931186 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:16.931356 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:16.931303 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:17.930591 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:17.930559 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:17.931134 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:17.930680 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:18.931126 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:18.930958 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:18.931644 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:18.931200 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:19.107513 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.107485 2580 generic.go:358] "Generic (PLEG): container finished" podID="4c7680da-cb3c-4ad2-b143-8ff457f88efe" containerID="4f5441973646337417408dae2c40b4ce4ab15f85d08d2036edf6eb4952c7f97c" exitCode=0 Apr 16 14:53:19.107663 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.107562 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerDied","Data":"4f5441973646337417408dae2c40b4ce4ab15f85d08d2036edf6eb4952c7f97c"} Apr 16 14:53:19.110705 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.110690 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 14:53:19.110996 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.110979 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"5c12ecc18a39589ed5bd2f679c3f414be682a6179203d90cdb00b25a3bbe65e7"} Apr 16 14:53:19.111223 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.111208 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:53:19.111318 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.111231 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:53:19.111410 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.111396 2580 scope.go:117] "RemoveContainer" containerID="45c1703ad8f03acfc77ea08597fb20a35f145b933e94bffd2383baa9bf9698fb" Apr 16 14:53:19.129177 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.129152 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:53:19.930709 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:19.930679 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:19.930868 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:19.930805 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:20.100561 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.100534 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kgs47"] Apr 16 14:53:20.101009 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.100678 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:20.101009 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:20.100791 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:20.103064 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.103042 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wvkzh"] Apr 16 14:53:20.116001 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.115855 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 14:53:20.116381 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.116351 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" event={"ID":"8e0c1234-5484-4c8c-9e5f-c0d64478ef21","Type":"ContainerStarted","Data":"5c3a8d37927e5b9806e13b7c321de7903eb6704ad93a501de8c0e7b428c66c92"} Apr 16 14:53:20.116811 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.116768 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:53:20.118372 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.118350 2580 generic.go:358] "Generic (PLEG): container finished" podID="4c7680da-cb3c-4ad2-b143-8ff457f88efe" containerID="54c4d74202c2ec443814834dfe6348ba4a3f0ac9d5d1894eca32082a5ab353ae" exitCode=0 Apr 16 14:53:20.118460 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.118425 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerDied","Data":"54c4d74202c2ec443814834dfe6348ba4a3f0ac9d5d1894eca32082a5ab353ae"} Apr 16 14:53:20.118460 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.118437 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:20.118666 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:20.118641 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:20.132759 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.132741 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:53:20.142040 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:20.142002 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" podStartSLOduration=10.095183894 podStartE2EDuration="27.141990187s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.51791101 +0000 UTC m=+3.171293363" lastFinishedPulling="2026-04-16 14:53:12.564717298 +0000 UTC m=+20.218099656" observedRunningTime="2026-04-16 14:53:20.141592151 +0000 UTC m=+27.794974541" watchObservedRunningTime="2026-04-16 14:53:20.141990187 +0000 UTC m=+27.795372599" Apr 16 14:53:21.121700 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:21.121672 2580 generic.go:358] "Generic (PLEG): container finished" podID="4c7680da-cb3c-4ad2-b143-8ff457f88efe" containerID="765498d3da596164b2de10ed7cf870ab960e32290e52117cc2786e63fffcd369" exitCode=0 Apr 16 14:53:21.122156 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:21.121764 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerDied","Data":"765498d3da596164b2de10ed7cf870ab960e32290e52117cc2786e63fffcd369"} Apr 16 14:53:21.801343 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:21.801292 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:53:21.801516 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:21.801467 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 14:53:21.802461 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:21.802435 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9dmjs" Apr 16 14:53:21.931288 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:21.931217 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:21.931437 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:21.931363 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:21.931701 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:21.931682 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:21.931833 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:21.931767 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:23.931418 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:23.931387 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:23.931917 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:23.931387 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:23.931917 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:23.931504 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wvkzh" podUID="f33dcc16-237a-4c31-aca0-f46c9648fc20" Apr 16 14:53:23.931917 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:23.931597 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:53:25.696464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.696435 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-160.ec2.internal" event="NodeReady" Apr 16 14:53:25.697053 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.696572 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:53:25.737451 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.737424 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dlgsh"] Apr 16 14:53:25.741679 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.741657 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lstqc"] Apr 16 14:53:25.741817 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.741798 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.743922 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.743898 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:53:25.744031 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.743991 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nrznh\"" Apr 16 14:53:25.744263 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.744246 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:53:25.744799 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.744776 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:25.746561 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.746545 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:53:25.746873 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.746836 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9k9b9\"" Apr 16 14:53:25.747087 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.747071 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:53:25.747174 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.747111 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:53:25.750226 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.750208 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dlgsh"] Apr 16 14:53:25.757573 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.757552 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lstqc"] Apr 16 14:53:25.837234 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.837199 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:25.837388 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.837269 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89bc5259-a854-4a23-908c-c4af285bd699-config-volume\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.837388 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.837291 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.837388 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.837315 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9hmp\" (UniqueName: \"kubernetes.io/projected/a1c49e18-2e18-4c91-9c90-58bd53f03775-kube-api-access-t9hmp\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:25.837519 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.837405 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97m5h\" (UniqueName: \"kubernetes.io/projected/89bc5259-a854-4a23-908c-c4af285bd699-kube-api-access-97m5h\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.837519 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.837444 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89bc5259-a854-4a23-908c-c4af285bd699-tmp-dir\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.930812 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.930776 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:25.930989 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.930875 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:25.937081 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.936956 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lnt9d\"" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.937328 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.937353 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.937394 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.937397 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-z9txl\"" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.937657 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89bc5259-a854-4a23-908c-c4af285bd699-config-volume\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.937678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:25.937751 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:25.937799 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:26.437784875 +0000 UTC m=+34.091167227 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.937983 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9hmp\" (UniqueName: \"kubernetes.io/projected/a1c49e18-2e18-4c91-9c90-58bd53f03775-kube-api-access-t9hmp\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.938014 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97m5h\" (UniqueName: \"kubernetes.io/projected/89bc5259-a854-4a23-908c-c4af285bd699-kube-api-access-97m5h\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.938034 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89bc5259-a854-4a23-908c-c4af285bd699-tmp-dir\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.938064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:25.938162 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:25.938192 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:26.438181877 +0000 UTC m=+34.091564229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.938256 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/89bc5259-a854-4a23-908c-c4af285bd699-config-volume\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.938807 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.938496 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/89bc5259-a854-4a23-908c-c4af285bd699-tmp-dir\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.948755 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.948704 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97m5h\" (UniqueName: \"kubernetes.io/projected/89bc5259-a854-4a23-908c-c4af285bd699-kube-api-access-97m5h\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:25.949291 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:25.949273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9hmp\" (UniqueName: \"kubernetes.io/projected/a1c49e18-2e18-4c91-9c90-58bd53f03775-kube-api-access-t9hmp\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:26.441588 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:26.441555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:26.441756 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:26.441623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:26.441756 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:26.441700 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:26.441756 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:26.441739 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:26.441926 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:26.441767 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:27.441750224 +0000 UTC m=+35.095132577 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:53:26.441926 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:26.441790 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:27.441776528 +0000 UTC m=+35.095158882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:53:26.542730 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:26.542696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:26.545692 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:26.545669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqdk\" (UniqueName: \"kubernetes.io/projected/f33dcc16-237a-4c31-aca0-f46c9648fc20-kube-api-access-thqdk\") pod \"network-check-target-wvkzh\" (UID: \"f33dcc16-237a-4c31-aca0-f46c9648fc20\") " pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:26.643945 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:26.643911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:26.644090 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:26.644014 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:26.644090 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:26.644067 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:58.644052667 +0000 UTC m=+66.297435023 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : secret "metrics-daemon-secret" not found Apr 16 14:53:26.841915 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:26.841835 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:27.006272 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:27.004855 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wvkzh"] Apr 16 14:53:27.033625 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:53:27.033565 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf33dcc16_237a_4c31_aca0_f46c9648fc20.slice/crio-0a7cefc086fc26ba673d90ed6c094bae4441b11b8db2ec3ee1800501e22dc2b7 WatchSource:0}: Error finding container 0a7cefc086fc26ba673d90ed6c094bae4441b11b8db2ec3ee1800501e22dc2b7: Status 404 returned error can't find the container with id 0a7cefc086fc26ba673d90ed6c094bae4441b11b8db2ec3ee1800501e22dc2b7 Apr 16 14:53:27.133747 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:27.133712 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wvkzh" event={"ID":"f33dcc16-237a-4c31-aca0-f46c9648fc20","Type":"ContainerStarted","Data":"0a7cefc086fc26ba673d90ed6c094bae4441b11b8db2ec3ee1800501e22dc2b7"} Apr 16 14:53:27.450317 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:27.450289 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:27.450433 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:27.450351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:27.450495 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:27.450427 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:27.450495 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:27.450463 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:27.450495 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:27.450491 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:29.450474794 +0000 UTC m=+37.103857150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:53:27.450623 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:27.450513 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:29.450497216 +0000 UTC m=+37.103879567 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:53:28.138600 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:28.138564 2580 generic.go:358] "Generic (PLEG): container finished" podID="4c7680da-cb3c-4ad2-b143-8ff457f88efe" containerID="28b9a57698f6dfc262e7da5b7de21ba79900f718adb89825987231f90ee73a3d" exitCode=0 Apr 16 14:53:28.139100 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:28.138629 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerDied","Data":"28b9a57698f6dfc262e7da5b7de21ba79900f718adb89825987231f90ee73a3d"} Apr 16 14:53:29.143295 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:29.143265 2580 generic.go:358] "Generic (PLEG): container finished" podID="4c7680da-cb3c-4ad2-b143-8ff457f88efe" containerID="c0d4e87dcf84f9965d99e48c12c1f6e3b52b68cdc07a424a7206e9ae3efaa063" exitCode=0 Apr 16 14:53:29.143705 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:29.143337 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerDied","Data":"c0d4e87dcf84f9965d99e48c12c1f6e3b52b68cdc07a424a7206e9ae3efaa063"} Apr 16 14:53:29.467519 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:29.467435 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:29.467657 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:29.467527 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:29.467657 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:29.467606 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:29.467657 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:29.467636 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:29.467791 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:29.467688 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:33.467667689 +0000 UTC m=+41.121050045 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:53:29.467791 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:29.467708 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:33.467698264 +0000 UTC m=+41.121080623 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:53:30.148563 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:30.148357 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" event={"ID":"4c7680da-cb3c-4ad2-b143-8ff457f88efe","Type":"ContainerStarted","Data":"d25b88e8058c8bb8860a63712d473d653f777ad0a01922434e610edf0f99fed5"} Apr 16 14:53:30.171615 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:30.171566 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w4dhd" podStartSLOduration=5.603127439 podStartE2EDuration="37.171546201s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:52:55.518186921 +0000 UTC m=+3.171569275" lastFinishedPulling="2026-04-16 14:53:27.086605672 +0000 UTC m=+34.739988037" observedRunningTime="2026-04-16 14:53:30.169211931 +0000 UTC m=+37.822594306" watchObservedRunningTime="2026-04-16 14:53:30.171546201 +0000 UTC m=+37.824928576" Apr 16 14:53:31.153060 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:31.153022 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wvkzh" event={"ID":"f33dcc16-237a-4c31-aca0-f46c9648fc20","Type":"ContainerStarted","Data":"f2e2f6fc72f1f779808ad921defc63959adfc303125aabd9c8e0061c6a904224"} Apr 16 14:53:31.153598 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:31.153450 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:53:31.169694 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:31.169651 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wvkzh" podStartSLOduration=35.18707805 podStartE2EDuration="38.169637383s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:53:27.063948633 +0000 UTC m=+34.717330985" lastFinishedPulling="2026-04-16 14:53:30.046507963 +0000 UTC m=+37.699890318" observedRunningTime="2026-04-16 14:53:31.169055095 +0000 UTC m=+38.822437470" watchObservedRunningTime="2026-04-16 14:53:31.169637383 +0000 UTC m=+38.823019757" Apr 16 14:53:33.492367 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:33.492335 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:33.492887 ip-10-0-137-160 
kubenswrapper[2580]: I0416 14:53:33.492379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:33.492887 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:33.492464 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:33.492887 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:33.492490 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:33.492887 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:33.492519 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:41.492506799 +0000 UTC m=+49.145889150 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:53:33.492887 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:33.492561 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:41.492543109 +0000 UTC m=+49.145925479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:53:41.545577 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:41.545543 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:41.546030 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:41.545595 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:41.546030 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:41.545684 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:41.546030 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:41.545689 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:41.546030 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:41.545735 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:53:57.545722915 +0000 UTC m=+65.199105266 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:53:41.546030 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:41.545747 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:53:57.545741932 +0000 UTC m=+65.199124283 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:53:52.134993 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:52.134967 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-czzzx" Apr 16 14:53:57.641225 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:57.641185 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:53:57.641225 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:57.641242 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:53:57.641643 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:57.641329 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:53:57.641643 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:57.641332 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:53:57.641643 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:57.641382 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:29.641368472 +0000 UTC m=+97.294750824 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:53:57.641643 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:57.641400 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:29.641386983 +0000 UTC m=+97.294769335 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:53:58.645994 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:53:58.645963 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:53:58.646358 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:58.646083 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:53:58.646358 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:53:58.646146 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:02.64613188 +0000 UTC m=+130.299514231 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : secret "metrics-daemon-secret" not found Apr 16 14:54:03.158465 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:03.158354 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wvkzh" Apr 16 14:54:29.740319 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:29.740284 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:54:29.740657 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:29.740339 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:54:29.740657 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:29.740411 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:54:29.740657 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:29.740426 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:54:29.740657 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:29.740477 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls podName:89bc5259-a854-4a23-908c-c4af285bd699 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:33.740463155 +0000 UTC m=+161.393845507 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls") pod "dns-default-dlgsh" (UID: "89bc5259-a854-4a23-908c-c4af285bd699") : secret "dns-default-metrics-tls" not found Apr 16 14:54:29.740657 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:29.740492 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert podName:a1c49e18-2e18-4c91-9c90-58bd53f03775 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:33.740485635 +0000 UTC m=+161.393867986 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert") pod "ingress-canary-lstqc" (UID: "a1c49e18-2e18-4c91-9c90-58bd53f03775") : secret "canary-serving-cert" not found Apr 16 14:54:50.272033 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.272000 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn"] Apr 16 14:54:50.274769 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.274752 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.277878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.277862 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:54:50.278522 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.278506 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:54:50.278522 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.278518 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2b4t9\"" Apr 16 14:54:50.278666 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.278626 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 14:54:50.278721 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.278682 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 14:54:50.284517 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.284498 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn"] Apr 16 14:54:50.369551 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.369524 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.369691 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.369584 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5d7d\" (UniqueName: \"kubernetes.io/projected/d6202eab-2202-486c-9663-9a51687b0dc8-kube-api-access-l5d7d\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.369691 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.369634 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d6202eab-2202-486c-9663-9a51687b0dc8-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.372576 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.372548 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-tx72c"] Apr 16 14:54:50.375141 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.375127 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.377234 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.377218 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2hq7g\"" Apr 16 14:54:50.377345 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.377220 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:54:50.377345 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.377222 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:54:50.377880 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.377611 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:54:50.377880 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.377576 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:54:50.384098 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.384080 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:54:50.386157 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.386138 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-tx72c"] Apr 16 14:54:50.470190 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470166 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp8bn\" (UniqueName: \"kubernetes.io/projected/9915bbf3-08d3-4eb4-b977-389f37e66425-kube-api-access-pp8bn\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.470289 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470211 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.470289 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470237 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9915bbf3-08d3-4eb4-b977-389f37e66425-snapshots\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.470289 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9915bbf3-08d3-4eb4-b977-389f37e66425-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.470289 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470282 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5d7d\" (UniqueName: \"kubernetes.io/projected/d6202eab-2202-486c-9663-9a51687b0dc8-kube-api-access-l5d7d\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.470457 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d6202eab-2202-486c-9663-9a51687b0dc8-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.470457 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9915bbf3-08d3-4eb4-b977-389f37e66425-tmp\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.470457 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:50.470329 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:50.470457 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470344 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9915bbf3-08d3-4eb4-b977-389f37e66425-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.470457 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.470358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9915bbf3-08d3-4eb4-b977-389f37e66425-serving-cert\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.470457 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:50.470383 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls podName:d6202eab-2202-486c-9663-9a51687b0dc8 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:54:50.970365155 +0000 UTC m=+118.623747537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4v4pn" (UID: "d6202eab-2202-486c-9663-9a51687b0dc8") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:50.471077 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.471056 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d6202eab-2202-486c-9663-9a51687b0dc8-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.480272 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.480245 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5d7d\" (UniqueName: \"kubernetes.io/projected/d6202eab-2202-486c-9663-9a51687b0dc8-kube-api-access-l5d7d\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.571494 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.571435 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9915bbf3-08d3-4eb4-b977-389f37e66425-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.571494 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.571476 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9915bbf3-08d3-4eb4-b977-389f37e66425-serving-cert\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.571494 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.571496 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp8bn\" (UniqueName: \"kubernetes.io/projected/9915bbf3-08d3-4eb4-b977-389f37e66425-kube-api-access-pp8bn\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.571673 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.571655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9915bbf3-08d3-4eb4-b977-389f37e66425-snapshots\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.571708 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.571685 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9915bbf3-08d3-4eb4-b977-389f37e66425-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.571771 ip-10-0-137-160 
kubenswrapper[2580]: I0416 14:54:50.571757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9915bbf3-08d3-4eb4-b977-389f37e66425-tmp\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.572217 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.572164 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9915bbf3-08d3-4eb4-b977-389f37e66425-tmp\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.572217 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.572174 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9915bbf3-08d3-4eb4-b977-389f37e66425-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.572217 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.572232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9915bbf3-08d3-4eb4-b977-389f37e66425-snapshots\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.572487 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.572330 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9915bbf3-08d3-4eb4-b977-389f37e66425-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.573805 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.573786 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9915bbf3-08d3-4eb4-b977-389f37e66425-serving-cert\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.580589 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.580567 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp8bn\" (UniqueName: \"kubernetes.io/projected/9915bbf3-08d3-4eb4-b977-389f37e66425-kube-api-access-pp8bn\") pod \"insights-operator-5785d4fcdd-tx72c\" (UID: \"9915bbf3-08d3-4eb4-b977-389f37e66425\") " pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.685533 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.685503 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" Apr 16 14:54:50.820905 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.820878 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-tx72c"] Apr 16 14:54:50.823877 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:54:50.823825 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9915bbf3_08d3_4eb4_b977_389f37e66425.slice/crio-1ceaf641f2fce01c1aaaa6a824ba22c550f8738244c9b735b9872597b9a3d09b WatchSource:0}: Error finding container 1ceaf641f2fce01c1aaaa6a824ba22c550f8738244c9b735b9872597b9a3d09b: Status 404 returned error can't find the container with id 1ceaf641f2fce01c1aaaa6a824ba22c550f8738244c9b735b9872597b9a3d09b Apr 16 14:54:50.974893 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:50.974863 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:50.974991 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:50.974943 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:50.975029 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:50.975003 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls podName:d6202eab-2202-486c-9663-9a51687b0dc8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:51.97498522 +0000 UTC m=+119.628367575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4v4pn" (UID: "d6202eab-2202-486c-9663-9a51687b0dc8") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:51.295022 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:51.294988 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" event={"ID":"9915bbf3-08d3-4eb4-b977-389f37e66425","Type":"ContainerStarted","Data":"1ceaf641f2fce01c1aaaa6a824ba22c550f8738244c9b735b9872597b9a3d09b"} Apr 16 14:54:51.982667 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:51.982631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:51.982884 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:51.982803 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:51.982954 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:51.982916 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls podName:d6202eab-2202-486c-9663-9a51687b0dc8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:53.98289522 +0000 UTC m=+121.636277593 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4v4pn" (UID: "d6202eab-2202-486c-9663-9a51687b0dc8") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:53.299793 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:53.299753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" event={"ID":"9915bbf3-08d3-4eb4-b977-389f37e66425","Type":"ContainerStarted","Data":"2b55ed2e3ff1817cd041820aa6d0419a81465872f73f68501c3e7c9f0ba0d3a2"} Apr 16 14:54:53.316106 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:53.316058 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" podStartSLOduration=1.424868619 podStartE2EDuration="3.316046259s" podCreationTimestamp="2026-04-16 14:54:50 +0000 UTC" firstStartedPulling="2026-04-16 14:54:50.825535506 +0000 UTC m=+118.478917857" lastFinishedPulling="2026-04-16 14:54:52.716713142 +0000 UTC m=+120.370095497" observedRunningTime="2026-04-16 14:54:53.315633709 +0000 UTC m=+120.969016080" watchObservedRunningTime="2026-04-16 14:54:53.316046259 +0000 UTC m=+120.969428633" Apr 16 14:54:53.996084 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:53.996042 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:53.996251 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:53.996182 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:53.996292 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:53.996251 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls podName:d6202eab-2202-486c-9663-9a51687b0dc8 nodeName:}" failed. No retries permitted until 2026-04-16 14:54:57.996233854 +0000 UTC m=+125.649616209 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4v4pn" (UID: "d6202eab-2202-486c-9663-9a51687b0dc8") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:56.197178 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:56.197153 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qswpg_48dde1ba-8911-4c19-9083-79bd3339f3bf/dns-node-resolver/0.log" Apr 16 14:54:57.196444 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:57.196415 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qpspz_9ab7c27b-be98-41cf-bbea-2ed5ab71d83f/node-ca/0.log" Apr 16 14:54:58.025593 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:54:58.025559 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:54:58.029405 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:58.029377 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:54:58.029524 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:54:58.029463 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls podName:d6202eab-2202-486c-9663-9a51687b0dc8 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:06.029441578 +0000 UTC m=+133.682823947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4v4pn" (UID: "d6202eab-2202-486c-9663-9a51687b0dc8") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:00.342238 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.342206 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn"] Apr 16 14:55:00.344945 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.344928 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.347192 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.347167 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.348018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.347993 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-kw5lz\"" Apr 16 14:55:00.348118 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.348025 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 14:55:00.348118 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.348034 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.348118 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.348025 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 14:55:00.354804 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.354783 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn"] Apr 16 14:55:00.445464 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.445442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.445583 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.445487 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.445583 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.445570 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pxr\" (UniqueName: \"kubernetes.io/projected/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-kube-api-access-f7pxr\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.449806 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.449789 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-jxm62"] Apr 16 14:55:00.452482 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.452468 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.454607 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.454588 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kt88t\"" Apr 16 14:55:00.454710 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.454623 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 14:55:00.454777 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.454747 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.454871 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.454774 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 14:55:00.455007 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.454995 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.459705 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.459687 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 14:55:00.461877 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.461858 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-jxm62"] Apr 16 14:55:00.546469 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.546449 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xxpm\" (UniqueName: \"kubernetes.io/projected/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-kube-api-access-5xxpm\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.546575 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.546486 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pxr\" (UniqueName: \"kubernetes.io/projected/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-kube-api-access-f7pxr\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.546575 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.546513 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-serving-cert\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.546644 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.546575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.546644 
ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.546631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.546714 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.546661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-trusted-ca\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.546714 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.546699 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-config\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.547170 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.547152 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.548863 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.548831 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.554096 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.554078 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5"] Apr 16 14:55:00.556946 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.556932 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.557126 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.557102 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5"] Apr 16 14:55:00.559135 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.559117 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hgllk\"" Apr 16 14:55:00.559135 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.559128 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 14:55:00.559271 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.559129 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.559428 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.559410 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.559917 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.559727 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:00.560872 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.560832 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf"] Apr 16 14:55:00.564463 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.564443 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" Apr 16 14:55:00.566404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.565712 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 14:55:00.566404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.565728 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.566404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.565802 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.566404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.565875 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-rl5ns\"" Apr 16 14:55:00.566404 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.566039 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 14:55:00.566948 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.566633 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 14:55:00.567001 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.566945 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-lzm4b\"" Apr 16 14:55:00.567242 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.567225 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:55:00.568751 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.568732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pxr\" (UniqueName: \"kubernetes.io/projected/2ff993dc-1d95-4aaf-b8a3-233fbf6081af-kube-api-access-f7pxr\") pod \"kube-storage-version-migrator-operator-756bb7d76f-pfnkn\" (UID: \"2ff993dc-1d95-4aaf-b8a3-233fbf6081af\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.569088 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.569069 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5"] Apr 16 14:55:00.570918 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.570899 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5"] Apr 16 14:55:00.573809 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.573789 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf"] Apr 16 14:55:00.647090 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647068 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-serving-cert\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 
14:55:00.647205 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647105 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59faa17e-f4a6-43d2-97a3-9144f068504f-serving-cert\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.647205 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647124 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r68d\" (UniqueName: \"kubernetes.io/projected/4f67a966-a577-4b75-926c-c578f1df2f3a-kube-api-access-8r68d\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:00.647205 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647140 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrn92\" (UniqueName: \"kubernetes.io/projected/59faa17e-f4a6-43d2-97a3-9144f068504f-kube-api-access-rrn92\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.647205 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647164 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-526fh\" (UniqueName: \"kubernetes.io/projected/7221e3d8-8a94-4e43-88a3-0261e15e31c2-kube-api-access-526fh\") pod \"volume-data-source-validator-7d955d5dd4-hh9lf\" (UID: \"7221e3d8-8a94-4e43-88a3-0261e15e31c2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" Apr 16 14:55:00.647340 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647307 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-trusted-ca\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.647374 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647352 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-config\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.647406 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647387 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59faa17e-f4a6-43d2-97a3-9144f068504f-config\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.647439 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647419 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod 
\"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:00.647509 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xxpm\" (UniqueName: \"kubernetes.io/projected/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-kube-api-access-5xxpm\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.648012 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.647992 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-config\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.648220 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.648204 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-trusted-ca\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.649784 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.649760 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-serving-cert\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.653092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.653069 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" Apr 16 14:55:00.655487 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.655459 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xxpm\" (UniqueName: \"kubernetes.io/projected/f5d0644e-6880-4f5d-8d37-6b6693b0bfea-kube-api-access-5xxpm\") pod \"console-operator-d87b8d5fc-jxm62\" (UID: \"f5d0644e-6880-4f5d-8d37-6b6693b0bfea\") " pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.748279 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.748249 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:00.748396 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.748339 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59faa17e-f4a6-43d2-97a3-9144f068504f-serving-cert\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.748396 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.748369 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r68d\" (UniqueName: \"kubernetes.io/projected/4f67a966-a577-4b75-926c-c578f1df2f3a-kube-api-access-8r68d\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:00.748396 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.748391 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrn92\" (UniqueName: \"kubernetes.io/projected/59faa17e-f4a6-43d2-97a3-9144f068504f-kube-api-access-rrn92\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.748515 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:00.748394 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:00.748515 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:00.748455 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls podName:4f67a966-a577-4b75-926c-c578f1df2f3a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:01.248441653 +0000 UTC m=+128.901824005 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6m7v5" (UID: "4f67a966-a577-4b75-926c-c578f1df2f3a") : secret "samples-operator-tls" not found Apr 16 14:55:00.748515 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.748495 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-526fh\" (UniqueName: \"kubernetes.io/projected/7221e3d8-8a94-4e43-88a3-0261e15e31c2-kube-api-access-526fh\") pod \"volume-data-source-validator-7d955d5dd4-hh9lf\" (UID: \"7221e3d8-8a94-4e43-88a3-0261e15e31c2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" Apr 16 14:55:00.748666 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.748586 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59faa17e-f4a6-43d2-97a3-9144f068504f-config\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.749101 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.749077 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59faa17e-f4a6-43d2-97a3-9144f068504f-config\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.750915 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.750889 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59faa17e-f4a6-43d2-97a3-9144f068504f-serving-cert\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.761146 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.761123 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:00.763618 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.763589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-526fh\" (UniqueName: \"kubernetes.io/projected/7221e3d8-8a94-4e43-88a3-0261e15e31c2-kube-api-access-526fh\") pod \"volume-data-source-validator-7d955d5dd4-hh9lf\" (UID: \"7221e3d8-8a94-4e43-88a3-0261e15e31c2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" Apr 16 14:55:00.763705 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.763625 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrn92\" (UniqueName: \"kubernetes.io/projected/59faa17e-f4a6-43d2-97a3-9144f068504f-kube-api-access-rrn92\") pod \"service-ca-operator-69965bb79d-ldsf5\" (UID: \"59faa17e-f4a6-43d2-97a3-9144f068504f\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.764190 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.764168 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r68d\" (UniqueName: \"kubernetes.io/projected/4f67a966-a577-4b75-926c-c578f1df2f3a-kube-api-access-8r68d\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:00.773832 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.773810 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn"] Apr 16 14:55:00.776665 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:00.776636 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ff993dc_1d95_4aaf_b8a3_233fbf6081af.slice/crio-65609a2e9a7bedbb6ba56cbccf3b5b9657367901ef680f9f5ebc851126421a13 WatchSource:0}: Error finding container 65609a2e9a7bedbb6ba56cbccf3b5b9657367901ef680f9f5ebc851126421a13: Status 404 returned error can't find the container with id 65609a2e9a7bedbb6ba56cbccf3b5b9657367901ef680f9f5ebc851126421a13 Apr 16 14:55:00.869575 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.869547 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" Apr 16 14:55:00.872968 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.872945 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-jxm62"] Apr 16 14:55:00.875763 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:00.875737 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5d0644e_6880_4f5d_8d37_6b6693b0bfea.slice/crio-48124e35572fb51cf1bb307a515e5fc4320659600c802ffd607a2e0b71d79f66 WatchSource:0}: Error finding container 48124e35572fb51cf1bb307a515e5fc4320659600c802ffd607a2e0b71d79f66: Status 404 returned error can't find the container with id 48124e35572fb51cf1bb307a515e5fc4320659600c802ffd607a2e0b71d79f66 Apr 16 14:55:00.887223 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.887199 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" Apr 16 14:55:00.993453 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:00.993426 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5"] Apr 16 14:55:00.996949 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:00.996916 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59faa17e_f4a6_43d2_97a3_9144f068504f.slice/crio-aa7a702dc704f5d05da0d3d6ebfe2739e4813ccfaab1324a098e9d4f1b2dd59b WatchSource:0}: Error finding container aa7a702dc704f5d05da0d3d6ebfe2739e4813ccfaab1324a098e9d4f1b2dd59b: Status 404 returned error can't find the container with id aa7a702dc704f5d05da0d3d6ebfe2739e4813ccfaab1324a098e9d4f1b2dd59b Apr 16 14:55:01.007298 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:01.007274 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf"] Apr 16 14:55:01.009949 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:01.009928 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7221e3d8_8a94_4e43_88a3_0261e15e31c2.slice/crio-96be884c837651d44b83a90ac2f0a7006efb229fbf4b7c717992a997547528f7 WatchSource:0}: Error finding container 96be884c837651d44b83a90ac2f0a7006efb229fbf4b7c717992a997547528f7: Status 404 returned error can't find the container with id 96be884c837651d44b83a90ac2f0a7006efb229fbf4b7c717992a997547528f7 Apr 16 14:55:01.252055 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:01.251978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:01.252212 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:01.252142 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:01.252270 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:01.252218 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls podName:4f67a966-a577-4b75-926c-c578f1df2f3a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:02.252197932 +0000 UTC m=+129.905580289 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6m7v5" (UID: "4f67a966-a577-4b75-926c-c578f1df2f3a") : secret "samples-operator-tls" not found Apr 16 14:55:01.314429 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:01.314401 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" event={"ID":"2ff993dc-1d95-4aaf-b8a3-233fbf6081af","Type":"ContainerStarted","Data":"65609a2e9a7bedbb6ba56cbccf3b5b9657367901ef680f9f5ebc851126421a13"} Apr 16 14:55:01.315329 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:01.315291 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" event={"ID":"59faa17e-f4a6-43d2-97a3-9144f068504f","Type":"ContainerStarted","Data":"aa7a702dc704f5d05da0d3d6ebfe2739e4813ccfaab1324a098e9d4f1b2dd59b"} Apr 16 14:55:01.316144 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:01.316125 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" event={"ID":"7221e3d8-8a94-4e43-88a3-0261e15e31c2","Type":"ContainerStarted","Data":"96be884c837651d44b83a90ac2f0a7006efb229fbf4b7c717992a997547528f7"} Apr 16 14:55:01.317044 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:01.317021 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" event={"ID":"f5d0644e-6880-4f5d-8d37-6b6693b0bfea","Type":"ContainerStarted","Data":"48124e35572fb51cf1bb307a515e5fc4320659600c802ffd607a2e0b71d79f66"} Apr 16 14:55:02.261808 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:02.261770 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:02.262269 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:02.261921 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:02.262269 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:02.262000 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls podName:4f67a966-a577-4b75-926c-c578f1df2f3a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:04.261977277 +0000 UTC m=+131.915359629 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6m7v5" (UID: "4f67a966-a577-4b75-926c-c578f1df2f3a") : secret "samples-operator-tls" not found Apr 16 14:55:02.665388 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:02.665347 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:55:02.665574 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:02.665517 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:55:02.665637 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:02.665593 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs podName:5c30c303-f0bf-425c-bb3f-ce75dde11fe3 nodeName:}" failed. No retries permitted until 2026-04-16 14:57:04.66557251 +0000 UTC m=+252.318954862 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs") pod "network-metrics-daemon-kgs47" (UID: "5c30c303-f0bf-425c-bb3f-ce75dde11fe3") : secret "metrics-daemon-secret" not found Apr 16 14:55:04.278545 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.278459 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:04.278913 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:04.278626 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:04.278913 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:04.278707 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls podName:4f67a966-a577-4b75-926c-c578f1df2f3a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.278686069 +0000 UTC m=+135.932068427 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6m7v5" (UID: "4f67a966-a577-4b75-926c-c578f1df2f3a") : secret "samples-operator-tls" not found Apr 16 14:55:04.325564 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.325530 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" event={"ID":"59faa17e-f4a6-43d2-97a3-9144f068504f","Type":"ContainerStarted","Data":"80f4b39c39299842e19d30b2caf95d31a7f892538e162c8ca02eb03f76cb20ae"} Apr 16 14:55:04.326861 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.326814 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" event={"ID":"7221e3d8-8a94-4e43-88a3-0261e15e31c2","Type":"ContainerStarted","Data":"04a69d833ac6d053c31de4e5c178f2e9de26d484a14f77e1899f0b6b603d5665"} Apr 16 14:55:04.328398 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.328377 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/0.log" Apr 16 14:55:04.328498 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.328413 2580 generic.go:358] "Generic (PLEG): container finished" podID="f5d0644e-6880-4f5d-8d37-6b6693b0bfea" containerID="cc027b136adba8b24a577a0af4286ce107a2235cac6f0368c6f93647ca4bd66e" exitCode=255 Apr 16 14:55:04.328567 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.328500 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" event={"ID":"f5d0644e-6880-4f5d-8d37-6b6693b0bfea","Type":"ContainerDied","Data":"cc027b136adba8b24a577a0af4286ce107a2235cac6f0368c6f93647ca4bd66e"} Apr 16 14:55:04.328735 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.328717 2580 scope.go:117] "RemoveContainer" containerID="cc027b136adba8b24a577a0af4286ce107a2235cac6f0368c6f93647ca4bd66e" Apr 16 14:55:04.329878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.329855 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" event={"ID":"2ff993dc-1d95-4aaf-b8a3-233fbf6081af","Type":"ContainerStarted","Data":"9976fcf586db8201915d262cd810becc94d782d34d7c893e0bcb65b98f3b93f6"} Apr 16 14:55:04.342724 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.342681 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" podStartSLOduration=1.385838251 podStartE2EDuration="4.342666721s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:00.998892215 +0000 UTC m=+128.652274571" lastFinishedPulling="2026-04-16 14:55:03.955720689 +0000 UTC m=+131.609103041" observedRunningTime="2026-04-16 14:55:04.34151091 +0000 UTC m=+131.994893285" watchObservedRunningTime="2026-04-16 14:55:04.342666721 +0000 UTC m=+131.996049097" Apr 16 14:55:04.375373 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.375299 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-hh9lf" podStartSLOduration=1.4364919 podStartE2EDuration="4.375282586s" podCreationTimestamp="2026-04-16 14:55:00 +0000 
UTC" firstStartedPulling="2026-04-16 14:55:01.011541186 +0000 UTC m=+128.664923538" lastFinishedPulling="2026-04-16 14:55:03.950331868 +0000 UTC m=+131.603714224" observedRunningTime="2026-04-16 14:55:04.374471151 +0000 UTC m=+132.027853526" watchObservedRunningTime="2026-04-16 14:55:04.375282586 +0000 UTC m=+132.028664961" Apr 16 14:55:04.392314 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:04.392266 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" podStartSLOduration=1.220355189 podStartE2EDuration="4.392249396s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:00.778527828 +0000 UTC m=+128.431910184" lastFinishedPulling="2026-04-16 14:55:03.950422034 +0000 UTC m=+131.603804391" observedRunningTime="2026-04-16 14:55:04.391245358 +0000 UTC m=+132.044627729" watchObservedRunningTime="2026-04-16 14:55:04.392249396 +0000 UTC m=+132.045631771" Apr 16 14:55:05.336989 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:05.336922 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 14:55:05.337346 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:05.337306 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/0.log" Apr 16 14:55:05.337346 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:05.337338 2580 generic.go:358] "Generic (PLEG): container finished" podID="f5d0644e-6880-4f5d-8d37-6b6693b0bfea" containerID="81a1c347bb37e041d9d624b2b89111a90a1bbaadf7e8b98f39d8fad240555086" exitCode=255 Apr 16 14:55:05.337470 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:05.337446 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" event={"ID":"f5d0644e-6880-4f5d-8d37-6b6693b0bfea","Type":"ContainerDied","Data":"81a1c347bb37e041d9d624b2b89111a90a1bbaadf7e8b98f39d8fad240555086"} Apr 16 14:55:05.337511 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:05.337488 2580 scope.go:117] "RemoveContainer" containerID="cc027b136adba8b24a577a0af4286ce107a2235cac6f0368c6f93647ca4bd66e" Apr 16 14:55:05.337770 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:05.337752 2580 scope.go:117] "RemoveContainer" containerID="81a1c347bb37e041d9d624b2b89111a90a1bbaadf7e8b98f39d8fad240555086" Apr 16 14:55:05.337990 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:05.337967 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-jxm62_openshift-console-operator(f5d0644e-6880-4f5d-8d37-6b6693b0bfea)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" podUID="f5d0644e-6880-4f5d-8d37-6b6693b0bfea" Apr 16 14:55:06.095434 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:06.095395 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:55:06.095602 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:06.095535 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:06.095602 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:06.095597 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls podName:d6202eab-2202-486c-9663-9a51687b0dc8 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:22.095582194 +0000 UTC m=+149.748964549 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-4v4pn" (UID: "d6202eab-2202-486c-9663-9a51687b0dc8") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:55:06.341409 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:06.341381 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 14:55:06.341796 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:06.341670 2580 scope.go:117] "RemoveContainer" containerID="81a1c347bb37e041d9d624b2b89111a90a1bbaadf7e8b98f39d8fad240555086" Apr 16 14:55:06.341859 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:06.341820 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-jxm62_openshift-console-operator(f5d0644e-6880-4f5d-8d37-6b6693b0bfea)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" podUID="f5d0644e-6880-4f5d-8d37-6b6693b0bfea" Apr 16 14:55:07.167308 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.167277 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-txqk5"] Apr 16 14:55:07.171493 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.171477 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.173997 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.173963 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 14:55:07.174121 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.173963 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 14:55:07.174121 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.174013 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 14:55:07.174791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.174775 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-vxvvh\"" Apr 16 14:55:07.174913 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.174865 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 14:55:07.180213 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.180189 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-txqk5"] Apr 16 14:55:07.304724 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.304689 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f867c529-5242-4b2c-9034-9b44df2b5ba8-signing-key\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.304918 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.304731 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f867c529-5242-4b2c-9034-9b44df2b5ba8-signing-cabundle\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.304918 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.304755 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq4jc\" (UniqueName: \"kubernetes.io/projected/f867c529-5242-4b2c-9034-9b44df2b5ba8-kube-api-access-nq4jc\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.405680 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.405650 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f867c529-5242-4b2c-9034-9b44df2b5ba8-signing-key\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.405680 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.405685 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f867c529-5242-4b2c-9034-9b44df2b5ba8-signing-cabundle\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.406115 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.405709 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq4jc\" (UniqueName: \"kubernetes.io/projected/f867c529-5242-4b2c-9034-9b44df2b5ba8-kube-api-access-nq4jc\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.407358 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.407338 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f867c529-5242-4b2c-9034-9b44df2b5ba8-signing-cabundle\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.408189 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.408172 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f867c529-5242-4b2c-9034-9b44df2b5ba8-signing-key\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.413818 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.413788 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq4jc\" (UniqueName: \"kubernetes.io/projected/f867c529-5242-4b2c-9034-9b44df2b5ba8-kube-api-access-nq4jc\") pod \"service-ca-bfc587fb7-txqk5\" (UID: \"f867c529-5242-4b2c-9034-9b44df2b5ba8\") " pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.480091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.480021 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" Apr 16 14:55:07.597445 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.597396 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-txqk5"] Apr 16 14:55:07.600713 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:07.600687 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf867c529_5242_4b2c_9034_9b44df2b5ba8.slice/crio-145fdf6770023f796b21e2a75d2d4c52063a949ca4129fd408a4b3f601ab8baf WatchSource:0}: Error finding container 145fdf6770023f796b21e2a75d2d4c52063a949ca4129fd408a4b3f601ab8baf: Status 404 returned error can't find the container with id 145fdf6770023f796b21e2a75d2d4c52063a949ca4129fd408a4b3f601ab8baf Apr 16 14:55:07.676231 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.676207 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8j958"] Apr 16 14:55:07.679450 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.679429 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.682726 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.682707 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:55:07.682982 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.682969 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2b2x9\"" Apr 16 14:55:07.683143 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.683115 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:55:07.697414 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.697393 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8j958"] Apr 16 14:55:07.809491 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.809422 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.809491 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.809460 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqc72\" (UniqueName: \"kubernetes.io/projected/9b8ad91f-8e14-47f4-b2ae-495edea3e670-kube-api-access-tqc72\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.809639 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.809497 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b8ad91f-8e14-47f4-b2ae-495edea3e670-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.809639 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.809578 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b8ad91f-8e14-47f4-b2ae-495edea3e670-data-volume\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.809639 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.809605 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b8ad91f-8e14-47f4-b2ae-495edea3e670-crio-socket\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.910482 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.910451 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8j958\" (UID: 
\"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.910642 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.910492 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqc72\" (UniqueName: \"kubernetes.io/projected/9b8ad91f-8e14-47f4-b2ae-495edea3e670-kube-api-access-tqc72\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.910642 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.910528 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b8ad91f-8e14-47f4-b2ae-495edea3e670-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.910642 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:07.910618 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:07.910774 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:07.910688 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls podName:9b8ad91f-8e14-47f4-b2ae-495edea3e670 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:08.410670118 +0000 UTC m=+136.064052478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8j958" (UID: "9b8ad91f-8e14-47f4-b2ae-495edea3e670") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:07.910774 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.910723 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b8ad91f-8e14-47f4-b2ae-495edea3e670-data-volume\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.910774 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.910763 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b8ad91f-8e14-47f4-b2ae-495edea3e670-crio-socket\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.910963 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.910918 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9b8ad91f-8e14-47f4-b2ae-495edea3e670-crio-socket\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.911050 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.911031 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9b8ad91f-8e14-47f4-b2ae-495edea3e670-data-volume\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " 
pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.911088 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.911052 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9b8ad91f-8e14-47f4-b2ae-495edea3e670-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:07.953008 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:07.952979 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqc72\" (UniqueName: \"kubernetes.io/projected/9b8ad91f-8e14-47f4-b2ae-495edea3e670-kube-api-access-tqc72\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:08.313934 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:08.313898 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:08.314094 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:08.314014 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:55:08.314094 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:08.314071 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls podName:4f67a966-a577-4b75-926c-c578f1df2f3a nodeName:}" failed. No retries permitted until 2026-04-16 14:55:16.314055221 +0000 UTC m=+143.967437576 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls") pod "cluster-samples-operator-667775844f-6m7v5" (UID: "4f67a966-a577-4b75-926c-c578f1df2f3a") : secret "samples-operator-tls" not found Apr 16 14:55:08.351602 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:08.351568 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" event={"ID":"f867c529-5242-4b2c-9034-9b44df2b5ba8","Type":"ContainerStarted","Data":"285f2ffa555ee570f14eb75284c332fae71e6296852a297979b08e6598111e68"} Apr 16 14:55:08.351602 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:08.351602 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" event={"ID":"f867c529-5242-4b2c-9034-9b44df2b5ba8","Type":"ContainerStarted","Data":"145fdf6770023f796b21e2a75d2d4c52063a949ca4129fd408a4b3f601ab8baf"} Apr 16 14:55:08.370480 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:08.370437 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-txqk5" podStartSLOduration=1.370423437 podStartE2EDuration="1.370423437s" podCreationTimestamp="2026-04-16 14:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:08.369734163 +0000 UTC m=+136.023116536" watchObservedRunningTime="2026-04-16 14:55:08.370423437 +0000 UTC m=+136.023805860" Apr 16 14:55:08.414444 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:08.414419 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:08.414779 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:08.414595 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:08.414779 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:08.414670 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls podName:9b8ad91f-8e14-47f4-b2ae-495edea3e670 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:09.414649246 +0000 UTC m=+137.068031603 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8j958" (UID: "9b8ad91f-8e14-47f4-b2ae-495edea3e670") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:09.421995 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:09.421955 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:09.422430 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:09.422072 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 14:55:09.422430 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:09.422138 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls podName:9b8ad91f-8e14-47f4-b2ae-495edea3e670 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:11.422118164 +0000 UTC m=+139.075500519 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8j958" (UID: "9b8ad91f-8e14-47f4-b2ae-495edea3e670") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:10.761766 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:10.761724 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:10.761766 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:10.761757 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:10.762287 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:10.762190 2580 scope.go:117] "RemoveContainer" containerID="81a1c347bb37e041d9d624b2b89111a90a1bbaadf7e8b98f39d8fad240555086" Apr 16 14:55:10.762425 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:10.762402 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-jxm62_openshift-console-operator(f5d0644e-6880-4f5d-8d37-6b6693b0bfea)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" podUID="f5d0644e-6880-4f5d-8d37-6b6693b0bfea" Apr 16 14:55:11.439666 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:11.439630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:11.439830 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:11.439744 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 
14:55:11.439830 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:11.439799 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls podName:9b8ad91f-8e14-47f4-b2ae-495edea3e670 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:15.439784203 +0000 UTC m=+143.093166558 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls") pod "insights-runtime-extractor-8j958" (UID: "9b8ad91f-8e14-47f4-b2ae-495edea3e670") : secret "insights-runtime-extractor-tls" not found Apr 16 14:55:15.471722 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:15.471687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:15.474263 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:15.474238 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9b8ad91f-8e14-47f4-b2ae-495edea3e670-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8j958\" (UID: \"9b8ad91f-8e14-47f4-b2ae-495edea3e670\") " pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:15.488910 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:15.488888 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8j958" Apr 16 14:55:15.602432 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:15.602401 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8j958"] Apr 16 14:55:15.607471 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:15.607442 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8ad91f_8e14_47f4_b2ae_495edea3e670.slice/crio-0b3b4328c2da8dd1af2b4a30dd2978beff6fa1626c9686850b4c63cd1ca27e1a WatchSource:0}: Error finding container 0b3b4328c2da8dd1af2b4a30dd2978beff6fa1626c9686850b4c63cd1ca27e1a: Status 404 returned error can't find the container with id 0b3b4328c2da8dd1af2b4a30dd2978beff6fa1626c9686850b4c63cd1ca27e1a Apr 16 14:55:16.370230 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:16.370200 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8j958" event={"ID":"9b8ad91f-8e14-47f4-b2ae-495edea3e670","Type":"ContainerStarted","Data":"4b78a95fb976bf7dd1dc2cf0a5ceb02a1f006fd49c79044da46e2bc2e340a02e"} Apr 16 14:55:16.370318 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:16.370241 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8j958" event={"ID":"9b8ad91f-8e14-47f4-b2ae-495edea3e670","Type":"ContainerStarted","Data":"0b3b4328c2da8dd1af2b4a30dd2978beff6fa1626c9686850b4c63cd1ca27e1a"} Apr 16 14:55:16.378981 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:16.378931 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:16.381296 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:16.381273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f67a966-a577-4b75-926c-c578f1df2f3a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-6m7v5\" (UID: \"4f67a966-a577-4b75-926c-c578f1df2f3a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:16.480942 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:16.480919 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" Apr 16 14:55:16.605466 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:16.605439 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5"] Apr 16 14:55:17.375129 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:17.375089 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8j958" event={"ID":"9b8ad91f-8e14-47f4-b2ae-495edea3e670","Type":"ContainerStarted","Data":"929f5eb4c520116dc484d2075f18d4e9bcc9b1fa5276c70f982ff8388e520230"} Apr 16 14:55:17.376312 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:17.376280 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" event={"ID":"4f67a966-a577-4b75-926c-c578f1df2f3a","Type":"ContainerStarted","Data":"ab6653d9008bef272006dccb1162f66b7a3b85f8a26bd4c26b8fb57449cb2d1b"} Apr 16 14:55:18.383794 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:18.383751 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" event={"ID":"4f67a966-a577-4b75-926c-c578f1df2f3a","Type":"ContainerStarted","Data":"4a9b77d846031e47d7a670a8753537f2783b57bde25d4b4716057f4c166ccb5f"} Apr 16 14:55:18.384263 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:18.383799 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" event={"ID":"4f67a966-a577-4b75-926c-c578f1df2f3a","Type":"ContainerStarted","Data":"69298081bbe44763169beac22c5c5cd5ab7005068593ae866e9fccfb20ff857e"} Apr 16 14:55:18.386052 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:18.386025 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8j958" event={"ID":"9b8ad91f-8e14-47f4-b2ae-495edea3e670","Type":"ContainerStarted","Data":"3fd074984dd44e2cc455c4cd0d6a8c43278fe3bcb0affd5056772c96b092de81"} Apr 16 14:55:18.401728 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:18.401630 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-6m7v5" podStartSLOduration=16.85551101 podStartE2EDuration="18.401613111s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:16.665152857 +0000 UTC m=+144.318535208" lastFinishedPulling="2026-04-16 14:55:18.211254958 +0000 UTC m=+145.864637309" observedRunningTime="2026-04-16 
14:55:18.400925962 +0000 UTC m=+146.054308366" watchObservedRunningTime="2026-04-16 14:55:18.401613111 +0000 UTC m=+146.054995484" Apr 16 14:55:18.418956 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:18.418911 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8j958" podStartSLOduration=8.875687303 podStartE2EDuration="11.41889959s" podCreationTimestamp="2026-04-16 14:55:07 +0000 UTC" firstStartedPulling="2026-04-16 14:55:15.66437916 +0000 UTC m=+143.317761515" lastFinishedPulling="2026-04-16 14:55:18.207591447 +0000 UTC m=+145.860973802" observedRunningTime="2026-04-16 14:55:18.418371032 +0000 UTC m=+146.071753407" watchObservedRunningTime="2026-04-16 14:55:18.41889959 +0000 UTC m=+146.072281963" Apr 16 14:55:22.127814 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:22.127784 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:55:22.130234 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:22.130208 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6202eab-2202-486c-9663-9a51687b0dc8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-4v4pn\" (UID: \"d6202eab-2202-486c-9663-9a51687b0dc8\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:55:22.385852 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:22.385767 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2b4t9\"" Apr 16 14:55:22.393636 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:22.393609 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" Apr 16 14:55:22.507831 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:22.507802 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn"] Apr 16 14:55:22.510900 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:22.510873 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6202eab_2202_486c_9663_9a51687b0dc8.slice/crio-e7f3e040ffce904d9e2ec7e841c585172c413a256e73ec824ae4d4acb0fb483f WatchSource:0}: Error finding container e7f3e040ffce904d9e2ec7e841c585172c413a256e73ec824ae4d4acb0fb483f: Status 404 returned error can't find the container with id e7f3e040ffce904d9e2ec7e841c585172c413a256e73ec824ae4d4acb0fb483f Apr 16 14:55:23.399405 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:23.399362 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" event={"ID":"d6202eab-2202-486c-9663-9a51687b0dc8","Type":"ContainerStarted","Data":"e7f3e040ffce904d9e2ec7e841c585172c413a256e73ec824ae4d4acb0fb483f"} Apr 16 14:55:23.931347 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:23.931320 2580 scope.go:117] "RemoveContainer" containerID="81a1c347bb37e041d9d624b2b89111a90a1bbaadf7e8b98f39d8fad240555086" Apr 16 14:55:24.404146 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:24.404103 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" event={"ID":"d6202eab-2202-486c-9663-9a51687b0dc8","Type":"ContainerStarted","Data":"8a8d6b348d1134c8659f36f002b25caaad188f5368fd117715f63e2395252acd"} Apr 16 14:55:24.405949 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:24.405924 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 14:55:24.406074 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:24.405984 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" event={"ID":"f5d0644e-6880-4f5d-8d37-6b6693b0bfea","Type":"ContainerStarted","Data":"060d50bfc9484f0ec26303f145e3924ab458197efeee2ad89b112ff866c46898"} Apr 16 14:55:24.406280 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:24.406257 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:24.421774 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:24.421614 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-4v4pn" podStartSLOduration=32.980747645 podStartE2EDuration="34.421601219s" podCreationTimestamp="2026-04-16 14:54:50 +0000 UTC" firstStartedPulling="2026-04-16 14:55:22.513152719 +0000 UTC m=+150.166535072" lastFinishedPulling="2026-04-16 14:55:23.954006294 +0000 UTC m=+151.607388646" observedRunningTime="2026-04-16 14:55:24.421559408 +0000 UTC m=+152.074941810" watchObservedRunningTime="2026-04-16 14:55:24.421601219 +0000 UTC m=+152.074983594" Apr 16 14:55:24.438495 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:24.438453 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" 
podStartSLOduration=21.362881927 podStartE2EDuration="24.438440438s" podCreationTimestamp="2026-04-16 14:55:00 +0000 UTC" firstStartedPulling="2026-04-16 14:55:00.877608419 +0000 UTC m=+128.530990772" lastFinishedPulling="2026-04-16 14:55:03.953166931 +0000 UTC m=+131.606549283" observedRunningTime="2026-04-16 14:55:24.438237661 +0000 UTC m=+152.091620080" watchObservedRunningTime="2026-04-16 14:55:24.438440438 +0000 UTC m=+152.091822813" Apr 16 14:55:24.860252 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:24.860223 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-jxm62" Apr 16 14:55:28.754785 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:28.754745 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dlgsh" podUID="89bc5259-a854-4a23-908c-c4af285bd699" Apr 16 14:55:28.762050 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:28.762024 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-lstqc" podUID="a1c49e18-2e18-4c91-9c90-58bd53f03775" Apr 16 14:55:28.947908 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:28.947868 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kgs47" podUID="5c30c303-f0bf-425c-bb3f-ce75dde11fe3" Apr 16 14:55:29.420367 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:29.420340 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:55:29.420515 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:29.420345 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dlgsh" Apr 16 14:55:30.175489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.175451 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-b8wv8"] Apr 16 14:55:30.184031 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.182979 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-594f8c9465-vncdh"] Apr 16 14:55:30.184031 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.183127 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-b8wv8" Apr 16 14:55:30.185816 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.185767 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-xpbjc\"" Apr 16 14:55:30.186652 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.186139 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.186652 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.186197 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:55:30.186652 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.186140 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:55:30.189389 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.189368 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q9vr8\"" Apr 16 14:55:30.189540 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.189387 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:55:30.189645 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.189400 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:55:30.190091 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.190070 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:55:30.190225 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.190206 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-b8wv8"] Apr 16 14:55:30.194548 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.194526 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:55:30.197042 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.197019 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-594f8c9465-vncdh"] Apr 16 14:55:30.283426 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89n8q\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-kube-api-access-89n8q\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.283566 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283443 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e1c716fe-66ae-400c-a328-11d7504d5480-image-registry-private-configuration\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.283566 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283485 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1c716fe-66ae-400c-a328-11d7504d5480-trusted-ca\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.283566 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283520 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-registry-tls\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.283566 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283543 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1c716fe-66ae-400c-a328-11d7504d5480-registry-certificates\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.283566 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283559 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw7pw\" (UniqueName: \"kubernetes.io/projected/b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66-kube-api-access-sw7pw\") pod \"downloads-586b57c7b4-b8wv8\" (UID: \"b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66\") " pod="openshift-console/downloads-586b57c7b4-b8wv8" Apr 16 14:55:30.283745 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283601 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1c716fe-66ae-400c-a328-11d7504d5480-installation-pull-secrets\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.283745 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283642 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1c716fe-66ae-400c-a328-11d7504d5480-ca-trust-extracted\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.283745 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.283695 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-bound-sa-token\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384064 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384034 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1c716fe-66ae-400c-a328-11d7504d5480-registry-certificates\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384064 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw7pw\" (UniqueName: \"kubernetes.io/projected/b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66-kube-api-access-sw7pw\") pod \"downloads-586b57c7b4-b8wv8\" (UID: \"b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66\") " pod="openshift-console/downloads-586b57c7b4-b8wv8" Apr 16 14:55:30.384281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384087 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1c716fe-66ae-400c-a328-11d7504d5480-installation-pull-secrets\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1c716fe-66ae-400c-a328-11d7504d5480-ca-trust-extracted\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384142 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-bound-sa-token\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384161 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89n8q\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-kube-api-access-89n8q\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384192 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e1c716fe-66ae-400c-a328-11d7504d5480-image-registry-private-configuration\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384209 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1c716fe-66ae-400c-a328-11d7504d5480-trusted-ca\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384251 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-registry-tls\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.384609 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.384585 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e1c716fe-66ae-400c-a328-11d7504d5480-ca-trust-extracted\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.385334 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.385311 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e1c716fe-66ae-400c-a328-11d7504d5480-trusted-ca\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.385475 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.385444 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e1c716fe-66ae-400c-a328-11d7504d5480-registry-certificates\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.386797 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.386777 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e1c716fe-66ae-400c-a328-11d7504d5480-image-registry-private-configuration\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.386910 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.386812 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-registry-tls\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.386910 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.386899 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e1c716fe-66ae-400c-a328-11d7504d5480-installation-pull-secrets\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.393011 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.392988 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-bound-sa-token\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.393812 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.393791 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw7pw\" (UniqueName: \"kubernetes.io/projected/b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66-kube-api-access-sw7pw\") pod \"downloads-586b57c7b4-b8wv8\" (UID: \"b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66\") " pod="openshift-console/downloads-586b57c7b4-b8wv8" Apr 16 14:55:30.393812 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.393802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89n8q\" (UniqueName: \"kubernetes.io/projected/e1c716fe-66ae-400c-a328-11d7504d5480-kube-api-access-89n8q\") pod \"image-registry-594f8c9465-vncdh\" (UID: \"e1c716fe-66ae-400c-a328-11d7504d5480\") " pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.499058 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.499006 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-b8wv8" Apr 16 14:55:30.503582 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.503560 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:30.631377 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.631349 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-b8wv8"] Apr 16 14:55:30.634067 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:30.634043 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9c6c00d_c7ce_49a6_8ad8_1af23b6a7f66.slice/crio-22b76a648a594b63caca0558c71a1cd906a7a4c61ad04080bb76f6b6c055ad84 WatchSource:0}: Error finding container 22b76a648a594b63caca0558c71a1cd906a7a4c61ad04080bb76f6b6c055ad84: Status 404 returned error can't find the container with id 22b76a648a594b63caca0558c71a1cd906a7a4c61ad04080bb76f6b6c055ad84 Apr 16 14:55:30.645723 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:30.645701 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-594f8c9465-vncdh"] Apr 16 14:55:30.648759 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:30.648717 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c716fe_66ae_400c_a328_11d7504d5480.slice/crio-0c17aaf69f603c75d768db2a6ddadf7d755a0d2e70156d31178f7f4ffa2a773b WatchSource:0}: Error finding container 0c17aaf69f603c75d768db2a6ddadf7d755a0d2e70156d31178f7f4ffa2a773b: Status 404 returned error can't find the container with id 0c17aaf69f603c75d768db2a6ddadf7d755a0d2e70156d31178f7f4ffa2a773b Apr 16 14:55:31.427326 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:31.427290 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-594f8c9465-vncdh" event={"ID":"e1c716fe-66ae-400c-a328-11d7504d5480","Type":"ContainerStarted","Data":"cab6659f43b502acf3c9caee0344e8332cdf900f01f0e9e089a41fc1adde3432"} Apr 16 14:55:31.427326 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:31.427327 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-594f8c9465-vncdh" event={"ID":"e1c716fe-66ae-400c-a328-11d7504d5480","Type":"ContainerStarted","Data":"0c17aaf69f603c75d768db2a6ddadf7d755a0d2e70156d31178f7f4ffa2a773b"} Apr 16 14:55:31.427787 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:31.427429 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:31.428392 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:31.428368 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-b8wv8" event={"ID":"b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66","Type":"ContainerStarted","Data":"22b76a648a594b63caca0558c71a1cd906a7a4c61ad04080bb76f6b6c055ad84"} Apr 16 14:55:31.447854 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:31.447801 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-594f8c9465-vncdh" podStartSLOduration=1.447790487 podStartE2EDuration="1.447790487s" podCreationTimestamp="2026-04-16 14:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 
14:55:31.447087669 +0000 UTC m=+159.100470044" watchObservedRunningTime="2026-04-16 14:55:31.447790487 +0000 UTC m=+159.101172861" Apr 16 14:55:33.534163 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.534129 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-pp9nl"] Apr 16 14:55:33.537481 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.537454 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.541306 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.541086 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 14:55:33.541306 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.541159 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-sfctt\"" Apr 16 14:55:33.541988 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.541965 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:55:33.542092 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.541995 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 14:55:33.552136 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.552112 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-pp9nl"] Apr 16 14:55:33.605238 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.605211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs5v\" (UniqueName: \"kubernetes.io/projected/db42dd70-d787-44af-bbaa-54821221e8eb-kube-api-access-cxs5v\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.605361 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.605275 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.605361 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.605350 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.605494 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.605468 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db42dd70-d787-44af-bbaa-54821221e8eb-metrics-client-ca\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 
14:55:33.706439 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.706411 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs5v\" (UniqueName: \"kubernetes.io/projected/db42dd70-d787-44af-bbaa-54821221e8eb-kube-api-access-cxs5v\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.706580 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.706457 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.706580 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.706484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.706580 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.706571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db42dd70-d787-44af-bbaa-54821221e8eb-metrics-client-ca\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.706764 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:33.706571 2580 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 14:55:33.706764 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:33.706658 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-tls podName:db42dd70-d787-44af-bbaa-54821221e8eb nodeName:}" failed. No retries permitted until 2026-04-16 14:55:34.206640672 +0000 UTC m=+161.860023025 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-tls") pod "prometheus-operator-78f957474d-pp9nl" (UID: "db42dd70-d787-44af-bbaa-54821221e8eb") : secret "prometheus-operator-tls" not found Apr 16 14:55:33.707449 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.707423 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db42dd70-d787-44af-bbaa-54821221e8eb-metrics-client-ca\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.709518 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.709499 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.719974 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.719954 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs5v\" (UniqueName: \"kubernetes.io/projected/db42dd70-d787-44af-bbaa-54821221e8eb-kube-api-access-cxs5v\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:33.807711 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.807652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:55:33.807821 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.807720 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:55:33.810348 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.810302 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89bc5259-a854-4a23-908c-c4af285bd699-metrics-tls\") pod \"dns-default-dlgsh\" (UID: \"89bc5259-a854-4a23-908c-c4af285bd699\") " pod="openshift-dns/dns-default-dlgsh" Apr 16 14:55:33.810701 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.810683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c49e18-2e18-4c91-9c90-58bd53f03775-cert\") pod \"ingress-canary-lstqc\" (UID: \"a1c49e18-2e18-4c91-9c90-58bd53f03775\") " pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:55:33.923551 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.923522 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nrznh\"" Apr 16 14:55:33.923551 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.923542 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9k9b9\"" Apr 16 14:55:33.931338 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.931315 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lstqc" Apr 16 14:55:33.931444 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:33.931349 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dlgsh" Apr 16 14:55:34.085514 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.085399 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lstqc"] Apr 16 14:55:34.090210 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:34.090182 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1c49e18_2e18_4c91_9c90_58bd53f03775.slice/crio-9257fded8069d7ef0161a7b0b3618350578ff021fa2c0592ceedfab41ec210a3 WatchSource:0}: Error finding container 9257fded8069d7ef0161a7b0b3618350578ff021fa2c0592ceedfab41ec210a3: Status 404 returned error can't find the container with id 9257fded8069d7ef0161a7b0b3618350578ff021fa2c0592ceedfab41ec210a3 Apr 16 14:55:34.102934 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.102910 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dlgsh"] Apr 16 14:55:34.105582 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:34.105558 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89bc5259_a854_4a23_908c_c4af285bd699.slice/crio-9f44f3a83e5bf10d199178a9aa9e3f684127deef8c1e8e68ac87a07b574dc99d WatchSource:0}: Error finding container 9f44f3a83e5bf10d199178a9aa9e3f684127deef8c1e8e68ac87a07b574dc99d: Status 404 returned error can't find the container with id 9f44f3a83e5bf10d199178a9aa9e3f684127deef8c1e8e68ac87a07b574dc99d Apr 16 14:55:34.211506 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.211475 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:34.214115 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.214085 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/db42dd70-d787-44af-bbaa-54821221e8eb-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-pp9nl\" (UID: \"db42dd70-d787-44af-bbaa-54821221e8eb\") " pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:34.437332 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.437289 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dlgsh" event={"ID":"89bc5259-a854-4a23-908c-c4af285bd699","Type":"ContainerStarted","Data":"9f44f3a83e5bf10d199178a9aa9e3f684127deef8c1e8e68ac87a07b574dc99d"} Apr 16 14:55:34.438450 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.438417 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lstqc" event={"ID":"a1c49e18-2e18-4c91-9c90-58bd53f03775","Type":"ContainerStarted","Data":"9257fded8069d7ef0161a7b0b3618350578ff021fa2c0592ceedfab41ec210a3"} Apr 16 14:55:34.448743 
ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.448712 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" Apr 16 14:55:34.611001 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:34.610950 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-pp9nl"] Apr 16 14:55:35.447169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:35.447128 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" event={"ID":"db42dd70-d787-44af-bbaa-54821221e8eb","Type":"ContainerStarted","Data":"5a0da210969a2ccf1aede36a544d00bb291bfe03d328aa529b8fc3c010d27389"} Apr 16 14:55:37.460536 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.460494 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dlgsh" event={"ID":"89bc5259-a854-4a23-908c-c4af285bd699","Type":"ContainerStarted","Data":"4e7647355fd1203091eadb6697b13171b30060a6257fce1339c9b208901c0cf0"} Apr 16 14:55:37.460536 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.460535 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dlgsh" event={"ID":"89bc5259-a854-4a23-908c-c4af285bd699","Type":"ContainerStarted","Data":"5240be22ae6f5444376e68692bba5aa063827a41d79a9f82a93ad99e87e2fa20"} Apr 16 14:55:37.461097 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.460622 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dlgsh" Apr 16 14:55:37.462682 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.462659 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lstqc" event={"ID":"a1c49e18-2e18-4c91-9c90-58bd53f03775","Type":"ContainerStarted","Data":"ab23daa7f93cdce3293c5d97e1bc414466a4926fca62164881f4f6b6bd5f6b4c"} Apr 16 14:55:37.464410 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.464374 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" event={"ID":"db42dd70-d787-44af-bbaa-54821221e8eb","Type":"ContainerStarted","Data":"b5eeffc51e5f348583b93c7e16553787e1ec0d9a0a99740c5683a09475aa295c"} Apr 16 14:55:37.464410 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.464403 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" event={"ID":"db42dd70-d787-44af-bbaa-54821221e8eb","Type":"ContainerStarted","Data":"a90f737bafdeaf3c0dfb789764b2cebd6aeb0b7bd964f03cb590d76ac0fd7a7c"} Apr 16 14:55:37.481547 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.481481 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dlgsh" podStartSLOduration=129.953245579 podStartE2EDuration="2m12.481464461s" podCreationTimestamp="2026-04-16 14:53:25 +0000 UTC" firstStartedPulling="2026-04-16 14:55:34.107597472 +0000 UTC m=+161.760979825" lastFinishedPulling="2026-04-16 14:55:36.63581635 +0000 UTC m=+164.289198707" observedRunningTime="2026-04-16 14:55:37.480329156 +0000 UTC m=+165.133711531" watchObservedRunningTime="2026-04-16 14:55:37.481464461 +0000 UTC m=+165.134846837" Apr 16 14:55:37.498241 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.498195 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-pp9nl" podStartSLOduration=2.477287466 
podStartE2EDuration="4.498185279s" podCreationTimestamp="2026-04-16 14:55:33 +0000 UTC" firstStartedPulling="2026-04-16 14:55:34.617311972 +0000 UTC m=+162.270694327" lastFinishedPulling="2026-04-16 14:55:36.638209773 +0000 UTC m=+164.291592140" observedRunningTime="2026-04-16 14:55:37.497212458 +0000 UTC m=+165.150594832" watchObservedRunningTime="2026-04-16 14:55:37.498185279 +0000 UTC m=+165.151567656" Apr 16 14:55:37.514700 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:37.514665 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lstqc" podStartSLOduration=129.967748165 podStartE2EDuration="2m12.514657721s" podCreationTimestamp="2026-04-16 14:53:25 +0000 UTC" firstStartedPulling="2026-04-16 14:55:34.092695769 +0000 UTC m=+161.746078136" lastFinishedPulling="2026-04-16 14:55:36.639605324 +0000 UTC m=+164.292987692" observedRunningTime="2026-04-16 14:55:37.514264792 +0000 UTC m=+165.167647188" watchObservedRunningTime="2026-04-16 14:55:37.514657721 +0000 UTC m=+165.168040074" Apr 16 14:55:39.921829 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.921792 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-s8gz9"] Apr 16 14:55:39.924736 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.924713 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:39.927356 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.926884 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ld94b\"" Apr 16 14:55:39.927356 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.927190 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 14:55:39.927551 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.927504 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:55:39.927760 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.927745 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 14:55:39.937534 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.937131 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-s8gz9"] Apr 16 14:55:39.944110 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.944084 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lhvl9"] Apr 16 14:55:39.946668 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.946645 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.949175 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.949155 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:55:39.949299 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.949225 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:55:39.949299 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.949252 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-48tx8\"" Apr 16 14:55:39.949299 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.949160 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:55:39.959909 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.959885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-sys\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960070 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960053 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-root\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960233 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960196 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:39.960338 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6c274c-eb63-47f4-95cb-022efb3ff364-metrics-client-ca\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960338 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960262 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:39.960338 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960305 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-wtmp\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " 
pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960509 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960337 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90d7f787-82ab-4219-b168-822d1a382c4e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:39.960509 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960363 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589f2\" (UniqueName: \"kubernetes.io/projected/90d7f787-82ab-4219-b168-822d1a382c4e-kube-api-access-589f2\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:39.960509 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960386 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-textfile\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960509 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-accelerators-collector-config\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960704 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:39.960704 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960603 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfkl\" (UniqueName: \"kubernetes.io/projected/9a6c274c-eb63-47f4-95cb-022efb3ff364-kube-api-access-lnfkl\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960704 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-tls\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.960704 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960674 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:39.961019 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:39.960701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90d7f787-82ab-4219-b168-822d1a382c4e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.061700 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061623 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-root\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.061700 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061685 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061711 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-root\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6c274c-eb63-47f4-95cb-022efb3ff364-metrics-client-ca\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061776 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061823 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-wtmp\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061875 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90d7f787-82ab-4219-b168-822d1a382c4e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: 
\"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061902 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-589f2\" (UniqueName: \"kubernetes.io/projected/90d7f787-82ab-4219-b168-822d1a382c4e-kube-api-access-589f2\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061928 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-textfile\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.061979 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.061962 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-accelerators-collector-config\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062007 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062013 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-wtmp\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062048 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfkl\" (UniqueName: \"kubernetes.io/projected/9a6c274c-eb63-47f4-95cb-022efb3ff364-kube-api-access-lnfkl\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-tls\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062101 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " 
pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062132 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90d7f787-82ab-4219-b168-822d1a382c4e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062174 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-sys\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062258 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a6c274c-eb63-47f4-95cb-022efb3ff364-sys\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062370 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062263 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-textfile\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062811 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062518 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90d7f787-82ab-4219-b168-822d1a382c4e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.062811 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-accelerators-collector-config\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062811 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.062774 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9a6c274c-eb63-47f4-95cb-022efb3ff364-metrics-client-ca\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.062990 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:40.062819 2580 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:55:40.062990 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:55:40.062900 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-tls podName:9a6c274c-eb63-47f4-95cb-022efb3ff364 nodeName:}" failed. No retries permitted until 2026-04-16 14:55:40.56287996 +0000 UTC m=+168.216262314 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-tls") pod "node-exporter-lhvl9" (UID: "9a6c274c-eb63-47f4-95cb-022efb3ff364") : secret "node-exporter-tls" not found Apr 16 14:55:40.063408 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.063381 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.063536 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.063485 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90d7f787-82ab-4219-b168-822d1a382c4e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.064766 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.064730 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.065763 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.065738 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.066143 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.066123 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90d7f787-82ab-4219-b168-822d1a382c4e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.071639 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.071621 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfkl\" (UniqueName: \"kubernetes.io/projected/9a6c274c-eb63-47f4-95cb-022efb3ff364-kube-api-access-lnfkl\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.071926 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.071905 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-589f2\" (UniqueName: \"kubernetes.io/projected/90d7f787-82ab-4219-b168-822d1a382c4e-kube-api-access-589f2\") pod \"kube-state-metrics-7479c89684-s8gz9\" (UID: \"90d7f787-82ab-4219-b168-822d1a382c4e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.243895 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.243803 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" Apr 16 14:55:40.566878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.566772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-tls\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.569568 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.569542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9a6c274c-eb63-47f4-95cb-022efb3ff364-node-exporter-tls\") pod \"node-exporter-lhvl9\" (UID: \"9a6c274c-eb63-47f4-95cb-022efb3ff364\") " pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:40.859229 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:40.859201 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lhvl9" Apr 16 14:55:43.930755 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:43.930719 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:55:46.160078 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.160041 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:55:46.165452 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.165428 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.168055 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168028 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:55:46.168179 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168134 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:55:46.168877 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168533 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:55:46.168877 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168546 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-nqohqcqthhs4\"" Apr 16 14:55:46.168877 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168572 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:55:46.168877 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168588 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:55:46.168877 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168674 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:55:46.168877 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.168725 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:55:46.169149 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.169103 2580 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:55:46.169534 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.169278 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:55:46.169534 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.169310 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:55:46.169655 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.169636 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:55:46.170593 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.170549 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-xnggt\"" Apr 16 14:55:46.172440 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.172197 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:55:46.172440 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.172288 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:55:46.185782 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.185762 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:55:46.218160 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218130 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218188 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218316 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218276 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-config\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218496 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218551 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218495 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218551 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218525 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218659 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218564 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-web-config\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218659 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218588 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218659 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218613 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218816 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218657 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218816 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218690 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsqws\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-kube-api-access-wsqws\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218816 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218717 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218816 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218743 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.218816 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218786 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.219074 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218819 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.219074 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218870 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.219074 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.218899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319574 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319574 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319579 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319718 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-config\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.319815 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319876 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-web-config\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319905 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319930 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319962 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.319989 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsqws\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-kube-api-access-wsqws\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.320014 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-config-out\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.320039 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.320090 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320631 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.320430 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.320686 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.320645 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.321111 ip-10-0-137-160 
kubenswrapper[2580]: I0416 14:55:46.321025 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.322914 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.322711 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.324433 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.324292 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.324598 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.324563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.326058 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.326003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-config\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.326526 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.326490 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-web-config\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.326645 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.326578 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.326993 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.326732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-config-out\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.326993 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.326917 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.327534 
ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.327505 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.328189 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.328167 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.328189 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.328177 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.328913 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.328888 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.329320 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.329295 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.329705 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.329686 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.330220 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.330200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsqws\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-kube-api-access-wsqws\") pod \"prometheus-k8s-0\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.478960 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:46.478866 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:55:46.982730 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:46.982692 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a6c274c_eb63_47f4_95cb_022efb3ff364.slice/crio-ed1481372aa927fcb8fcffd716f4c3f2ac7270a5ed1f520b5532cd2bbb38d0ad WatchSource:0}: Error finding container ed1481372aa927fcb8fcffd716f4c3f2ac7270a5ed1f520b5532cd2bbb38d0ad: Status 404 returned error can't find the container with id ed1481372aa927fcb8fcffd716f4c3f2ac7270a5ed1f520b5532cd2bbb38d0ad Apr 16 14:55:47.125684 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.125657 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:55:47.128331 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:47.128302 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bcf4cba_e4bb_420b_910e_252d93e6275a.slice/crio-b9bfdb4202fdf3cdd4a050609b4da3ba6f9c3df9e29a0c62f9858468155dcdc0 WatchSource:0}: Error finding container b9bfdb4202fdf3cdd4a050609b4da3ba6f9c3df9e29a0c62f9858468155dcdc0: Status 404 returned error can't find the container with id b9bfdb4202fdf3cdd4a050609b4da3ba6f9c3df9e29a0c62f9858468155dcdc0 Apr 16 14:55:47.131379 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.131359 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-s8gz9"] Apr 16 14:55:47.135106 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:55:47.135084 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d7f787_82ab_4219_b168_822d1a382c4e.slice/crio-0cc2cb50a8e37cc5c2c2b4d872704280e0ce1cae4c59e7abbfd6da44e235218a WatchSource:0}: Error finding container 0cc2cb50a8e37cc5c2c2b4d872704280e0ce1cae4c59e7abbfd6da44e235218a: Status 404 returned error can't find the container with id 0cc2cb50a8e37cc5c2c2b4d872704280e0ce1cae4c59e7abbfd6da44e235218a Apr 16 14:55:47.470682 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.470646 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dlgsh" Apr 16 14:55:47.502246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.501675 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-b8wv8" event={"ID":"b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66","Type":"ContainerStarted","Data":"49fa896d8bf363a7f6524d1aa8a64671c721c41338dacb3a9d482da1bef5f0ea"} Apr 16 14:55:47.502246 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.502150 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-b8wv8" Apr 16 14:55:47.505265 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.505224 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lhvl9" event={"ID":"9a6c274c-eb63-47f4-95cb-022efb3ff364","Type":"ContainerStarted","Data":"ed1481372aa927fcb8fcffd716f4c3f2ac7270a5ed1f520b5532cd2bbb38d0ad"} Apr 16 14:55:47.507254 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.507228 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" 
event={"ID":"90d7f787-82ab-4219-b168-822d1a382c4e","Type":"ContainerStarted","Data":"0cc2cb50a8e37cc5c2c2b4d872704280e0ce1cae4c59e7abbfd6da44e235218a"} Apr 16 14:55:47.509429 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.509379 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerStarted","Data":"b9bfdb4202fdf3cdd4a050609b4da3ba6f9c3df9e29a0c62f9858468155dcdc0"} Apr 16 14:55:47.516188 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.516167 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-b8wv8" Apr 16 14:55:47.540225 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:47.540166 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-b8wv8" podStartSLOduration=1.133862056 podStartE2EDuration="17.540146039s" podCreationTimestamp="2026-04-16 14:55:30 +0000 UTC" firstStartedPulling="2026-04-16 14:55:30.635938104 +0000 UTC m=+158.289320456" lastFinishedPulling="2026-04-16 14:55:47.042222087 +0000 UTC m=+174.695604439" observedRunningTime="2026-04-16 14:55:47.537139775 +0000 UTC m=+175.190522151" watchObservedRunningTime="2026-04-16 14:55:47.540146039 +0000 UTC m=+175.193528414" Apr 16 14:55:49.519241 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.519154 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" event={"ID":"90d7f787-82ab-4219-b168-822d1a382c4e","Type":"ContainerStarted","Data":"1b173cc2924ca0f33a0fa04a6bbdc3f76d00fb63dbbc4609d8f6589e1d0a31e3"} Apr 16 14:55:49.519241 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.519204 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" event={"ID":"90d7f787-82ab-4219-b168-822d1a382c4e","Type":"ContainerStarted","Data":"7a7c1bf3a89626a0735fe7aed51d8cd13532d1be74007a2d10bd6f2493408494"} Apr 16 14:55:49.519241 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.519220 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" event={"ID":"90d7f787-82ab-4219-b168-822d1a382c4e","Type":"ContainerStarted","Data":"8a2b3f7aab1d2ac9857b1d53000aa5b76eb8963aa8f6cd27f504725fbb8ac2e4"} Apr 16 14:55:49.520761 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.520724 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" exitCode=0 Apr 16 14:55:49.520901 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.520874 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13"} Apr 16 14:55:49.523165 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.522488 2580 generic.go:358] "Generic (PLEG): container finished" podID="9a6c274c-eb63-47f4-95cb-022efb3ff364" containerID="aa56a8750eba01a65bbe51f760f98b3716ca94758c700aa40c5709a450323cfc" exitCode=0 Apr 16 14:55:49.523165 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.522579 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lhvl9" 
event={"ID":"9a6c274c-eb63-47f4-95cb-022efb3ff364","Type":"ContainerDied","Data":"aa56a8750eba01a65bbe51f760f98b3716ca94758c700aa40c5709a450323cfc"} Apr 16 14:55:49.539233 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:49.539183 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-s8gz9" podStartSLOduration=8.647304839 podStartE2EDuration="10.539167518s" podCreationTimestamp="2026-04-16 14:55:39 +0000 UTC" firstStartedPulling="2026-04-16 14:55:47.137037748 +0000 UTC m=+174.790420100" lastFinishedPulling="2026-04-16 14:55:49.028900427 +0000 UTC m=+176.682282779" observedRunningTime="2026-04-16 14:55:49.537250786 +0000 UTC m=+177.190633162" watchObservedRunningTime="2026-04-16 14:55:49.539167518 +0000 UTC m=+177.192549893" Apr 16 14:55:50.528536 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:50.528492 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lhvl9" event={"ID":"9a6c274c-eb63-47f4-95cb-022efb3ff364","Type":"ContainerStarted","Data":"4af07fb24c381f34cffa03c197e69e58a41dc46c3875d048d61c4eb41d395348"} Apr 16 14:55:50.529049 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:50.528543 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lhvl9" event={"ID":"9a6c274c-eb63-47f4-95cb-022efb3ff364","Type":"ContainerStarted","Data":"e850b365d6d79a5c12d2f3ecff12beec0c7d3063ab2a6249635cdd4c9489f0fa"} Apr 16 14:55:50.552225 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:50.552061 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lhvl9" podStartSLOduration=9.535498844 podStartE2EDuration="11.552044125s" podCreationTimestamp="2026-04-16 14:55:39 +0000 UTC" firstStartedPulling="2026-04-16 14:55:46.984907017 +0000 UTC m=+174.638289435" lastFinishedPulling="2026-04-16 14:55:49.001452347 +0000 UTC m=+176.654834716" observedRunningTime="2026-04-16 14:55:50.551760119 +0000 UTC m=+178.205142493" watchObservedRunningTime="2026-04-16 14:55:50.552044125 +0000 UTC m=+178.205426499" Apr 16 14:55:52.436686 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:52.436652 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-594f8c9465-vncdh" Apr 16 14:55:53.542338 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:53.542298 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerStarted","Data":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} Apr 16 14:55:53.542338 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:53.542342 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerStarted","Data":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} Apr 16 14:55:56.556504 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:56.556470 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerStarted","Data":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} Apr 16 14:55:56.556868 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:56.556518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerStarted","Data":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} Apr 16 14:55:56.556868 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:56.556531 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerStarted","Data":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} Apr 16 14:55:57.563601 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:57.563559 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerStarted","Data":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} Apr 16 14:55:57.595892 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:55:57.595815 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.65249132 podStartE2EDuration="11.595799018s" podCreationTimestamp="2026-04-16 14:55:46 +0000 UTC" firstStartedPulling="2026-04-16 14:55:47.130631323 +0000 UTC m=+174.784013689" lastFinishedPulling="2026-04-16 14:55:56.073939034 +0000 UTC m=+183.727321387" observedRunningTime="2026-04-16 14:55:57.59329456 +0000 UTC m=+185.246676965" watchObservedRunningTime="2026-04-16 14:55:57.595799018 +0000 UTC m=+185.249181393" Apr 16 14:56:01.480007 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:01.479972 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:09.598536 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:09.598504 2580 generic.go:358] "Generic (PLEG): container finished" podID="59faa17e-f4a6-43d2-97a3-9144f068504f" containerID="80f4b39c39299842e19d30b2caf95d31a7f892538e162c8ca02eb03f76cb20ae" exitCode=0 Apr 16 14:56:09.598988 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:09.598577 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" event={"ID":"59faa17e-f4a6-43d2-97a3-9144f068504f","Type":"ContainerDied","Data":"80f4b39c39299842e19d30b2caf95d31a7f892538e162c8ca02eb03f76cb20ae"} Apr 16 14:56:09.598988 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:09.598918 2580 scope.go:117] "RemoveContainer" containerID="80f4b39c39299842e19d30b2caf95d31a7f892538e162c8ca02eb03f76cb20ae" Apr 16 14:56:10.602797 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:10.602762 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-ldsf5" event={"ID":"59faa17e-f4a6-43d2-97a3-9144f068504f","Type":"ContainerStarted","Data":"c36fdeeeed0a20bb66e05c6c6cd3748841d77ac63356672988f068320ed373d8"} Apr 16 14:56:14.614345 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:14.614311 2580 generic.go:358] "Generic (PLEG): container finished" podID="2ff993dc-1d95-4aaf-b8a3-233fbf6081af" containerID="9976fcf586db8201915d262cd810becc94d782d34d7c893e0bcb65b98f3b93f6" exitCode=0 Apr 16 14:56:14.614746 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:14.614376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" event={"ID":"2ff993dc-1d95-4aaf-b8a3-233fbf6081af","Type":"ContainerDied","Data":"9976fcf586db8201915d262cd810becc94d782d34d7c893e0bcb65b98f3b93f6"} Apr 16 14:56:14.614746 
ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:14.614710 2580 scope.go:117] "RemoveContainer" containerID="9976fcf586db8201915d262cd810becc94d782d34d7c893e0bcb65b98f3b93f6" Apr 16 14:56:15.621667 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:15.621634 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-pfnkn" event={"ID":"2ff993dc-1d95-4aaf-b8a3-233fbf6081af","Type":"ContainerStarted","Data":"7a526d766db0acd959539212f40d163df5e49513238522972120a32f6bfca5a1"} Apr 16 14:56:18.631460 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:18.631430 2580 generic.go:358] "Generic (PLEG): container finished" podID="9915bbf3-08d3-4eb4-b977-389f37e66425" containerID="2b55ed2e3ff1817cd041820aa6d0419a81465872f73f68501c3e7c9f0ba0d3a2" exitCode=0 Apr 16 14:56:18.631813 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:18.631501 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" event={"ID":"9915bbf3-08d3-4eb4-b977-389f37e66425","Type":"ContainerDied","Data":"2b55ed2e3ff1817cd041820aa6d0419a81465872f73f68501c3e7c9f0ba0d3a2"} Apr 16 14:56:18.631813 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:18.631796 2580 scope.go:117] "RemoveContainer" containerID="2b55ed2e3ff1817cd041820aa6d0419a81465872f73f68501c3e7c9f0ba0d3a2" Apr 16 14:56:19.635479 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:19.635444 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-tx72c" event={"ID":"9915bbf3-08d3-4eb4-b977-389f37e66425","Type":"ContainerStarted","Data":"f3a14680a08688ba97bea5ffcdcfaea84ec90d62ab10aaeb959d20e64cef3a27"} Apr 16 14:56:46.479182 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:46.479128 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:46.498554 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:46.498522 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:56:46.727208 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:56:46.727172 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:04.477361 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.477283 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:57:04.477782 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.477723 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="prometheus" containerID="cri-o://a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" gracePeriod=600 Apr 16 14:57:04.477782 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.477759 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="thanos-sidecar" containerID="cri-o://60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" gracePeriod=600 Apr 16 14:57:04.477878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.477756 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" 
containerName="kube-rbac-proxy-thanos" containerID="cri-o://78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" gracePeriod=600 Apr 16 14:57:04.477878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.477774 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy-web" containerID="cri-o://e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" gracePeriod=600 Apr 16 14:57:04.477878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.477813 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy" containerID="cri-o://75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" gracePeriod=600 Apr 16 14:57:04.477878 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.477762 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="config-reloader" containerID="cri-o://36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" gracePeriod=600 Apr 16 14:57:04.676675 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.676629 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:57:04.679101 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.679071 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c30c303-f0bf-425c-bb3f-ce75dde11fe3-metrics-certs\") pod \"network-metrics-daemon-kgs47\" (UID: \"5c30c303-f0bf-425c-bb3f-ce75dde11fe3\") " pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:57:04.718906 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.718735 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:04.767430 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767345 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" exitCode=0 Apr 16 14:57:04.767430 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767370 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" exitCode=0 Apr 16 14:57:04.767430 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767377 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" exitCode=0 Apr 16 14:57:04.767430 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767383 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" exitCode=0 Apr 16 14:57:04.767430 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767389 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" exitCode=0 Apr 16 14:57:04.767430 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767396 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" exitCode=0 Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767444 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767464 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767480 2580 scope.go:117] "RemoveContainer" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767471 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767621 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767672 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767687 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} Apr 16 14:57:04.767824 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.767701 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8bcf4cba-e4bb-420b-910e-252d93e6275a","Type":"ContainerDied","Data":"b9bfdb4202fdf3cdd4a050609b4da3ba6f9c3df9e29a0c62f9858468155dcdc0"} Apr 16 14:57:04.775502 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.775479 2580 scope.go:117] "RemoveContainer" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.783507 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.783483 2580 scope.go:117] "RemoveContainer" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.791862 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.791770 2580 scope.go:117] "RemoveContainer" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.800041 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.800020 2580 scope.go:117] "RemoveContainer" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.808943 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.808926 2580 scope.go:117] "RemoveContainer" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.815763 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.815743 2580 scope.go:117] "RemoveContainer" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.822270 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.822253 2580 scope.go:117] "RemoveContainer" 
containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.822524 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:57:04.822499 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": container with ID starting with 78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb not found: ID does not exist" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.822581 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.822535 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} err="failed to get container status \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": rpc error: code = NotFound desc = could not find container \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": container with ID starting with 78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb not found: ID does not exist" Apr 16 14:57:04.822581 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.822569 2580 scope.go:117] "RemoveContainer" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.822799 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:57:04.822781 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": container with ID starting with 75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd not found: ID does not exist" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.822874 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.822804 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} err="failed to get container status \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": rpc error: code = NotFound desc = could not find container \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": container with ID starting with 75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd not found: ID does not exist" Apr 16 14:57:04.822874 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.822816 2580 scope.go:117] "RemoveContainer" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.823076 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:57:04.823059 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": container with ID starting with e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065 not found: ID does not exist" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.823118 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.823083 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} err="failed to get container status \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": rpc error: code = NotFound desc = could not 
find container \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": container with ID starting with e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065 not found: ID does not exist" Apr 16 14:57:04.823118 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.823099 2580 scope.go:117] "RemoveContainer" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.823313 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:57:04.823292 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": container with ID starting with 60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12 not found: ID does not exist" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.823393 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.823323 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} err="failed to get container status \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": rpc error: code = NotFound desc = could not find container \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": container with ID starting with 60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12 not found: ID does not exist" Apr 16 14:57:04.823393 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.823339 2580 scope.go:117] "RemoveContainer" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.823573 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:57:04.823552 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": container with ID starting with 36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48 not found: ID does not exist" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.823616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.823579 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} err="failed to get container status \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": rpc error: code = NotFound desc = could not find container \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": container with ID starting with 36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48 not found: ID does not exist" Apr 16 14:57:04.823616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.823593 2580 scope.go:117] "RemoveContainer" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.823810 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:57:04.823795 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": container with ID starting with a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060 not found: ID does not exist" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.823892 ip-10-0-137-160 kubenswrapper[2580]: I0416 
14:57:04.823814 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} err="failed to get container status \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": rpc error: code = NotFound desc = could not find container \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": container with ID starting with a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060 not found: ID does not exist" Apr 16 14:57:04.823892 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.823828 2580 scope.go:117] "RemoveContainer" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.824083 ip-10-0-137-160 kubenswrapper[2580]: E0416 14:57:04.824063 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": container with ID starting with ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13 not found: ID does not exist" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.824160 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824086 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13"} err="failed to get container status \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": rpc error: code = NotFound desc = could not find container \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": container with ID starting with ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13 not found: ID does not exist" Apr 16 14:57:04.824160 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824100 2580 scope.go:117] "RemoveContainer" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.824331 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824312 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} err="failed to get container status \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": rpc error: code = NotFound desc = could not find container \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": container with ID starting with 78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb not found: ID does not exist" Apr 16 14:57:04.824386 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824331 2580 scope.go:117] "RemoveContainer" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.824616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824596 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} err="failed to get container status \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": rpc error: code = NotFound desc = could not find container \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": container with ID starting with 75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd not found: ID does not exist" Apr 16 14:57:04.824689 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824618 2580 
scope.go:117] "RemoveContainer" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.824879 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824857 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} err="failed to get container status \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": rpc error: code = NotFound desc = could not find container \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": container with ID starting with e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065 not found: ID does not exist" Apr 16 14:57:04.824926 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.824881 2580 scope.go:117] "RemoveContainer" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.825079 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825063 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} err="failed to get container status \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": rpc error: code = NotFound desc = could not find container \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": container with ID starting with 60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12 not found: ID does not exist" Apr 16 14:57:04.825124 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825080 2580 scope.go:117] "RemoveContainer" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.825263 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825245 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} err="failed to get container status \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": rpc error: code = NotFound desc = could not find container \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": container with ID starting with 36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48 not found: ID does not exist" Apr 16 14:57:04.825322 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825265 2580 scope.go:117] "RemoveContainer" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.825483 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825468 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} err="failed to get container status \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": rpc error: code = NotFound desc = could not find container \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": container with ID starting with a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060 not found: ID does not exist" Apr 16 14:57:04.825530 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825483 2580 scope.go:117] "RemoveContainer" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.825694 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825677 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13"} err="failed to get container status \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": rpc error: code = NotFound desc = could not find container \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": container with ID starting with ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13 not found: ID does not exist" Apr 16 14:57:04.825736 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825695 2580 scope.go:117] "RemoveContainer" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.825910 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825893 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} err="failed to get container status \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": rpc error: code = NotFound desc = could not find container \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": container with ID starting with 78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb not found: ID does not exist" Apr 16 14:57:04.825982 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.825912 2580 scope.go:117] "RemoveContainer" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.826145 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826128 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} err="failed to get container status \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": rpc error: code = NotFound desc = could not find container \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": container with ID starting with 75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd not found: ID does not exist" Apr 16 14:57:04.826196 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826146 2580 scope.go:117] "RemoveContainer" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.826355 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826335 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} err="failed to get container status \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": rpc error: code = NotFound desc = could not find container \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": container with ID starting with e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065 not found: ID does not exist" Apr 16 14:57:04.826396 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826356 2580 scope.go:117] "RemoveContainer" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.826545 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826529 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} err="failed to get container status \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": rpc error: code = NotFound desc = could not find container 
\"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": container with ID starting with 60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12 not found: ID does not exist" Apr 16 14:57:04.826610 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826547 2580 scope.go:117] "RemoveContainer" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.826758 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826742 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} err="failed to get container status \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": rpc error: code = NotFound desc = could not find container \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": container with ID starting with 36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48 not found: ID does not exist" Apr 16 14:57:04.826802 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826758 2580 scope.go:117] "RemoveContainer" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.826956 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826937 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} err="failed to get container status \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": rpc error: code = NotFound desc = could not find container \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": container with ID starting with a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060 not found: ID does not exist" Apr 16 14:57:04.827030 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.826957 2580 scope.go:117] "RemoveContainer" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.827190 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827173 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13"} err="failed to get container status \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": rpc error: code = NotFound desc = could not find container \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": container with ID starting with ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13 not found: ID does not exist" Apr 16 14:57:04.827260 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827192 2580 scope.go:117] "RemoveContainer" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.827413 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827395 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} err="failed to get container status \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": rpc error: code = NotFound desc = could not find container \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": container with ID starting with 78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb not found: ID does not exist" Apr 16 14:57:04.827460 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827414 2580 scope.go:117] 
"RemoveContainer" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.827608 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827591 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} err="failed to get container status \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": rpc error: code = NotFound desc = could not find container \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": container with ID starting with 75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd not found: ID does not exist" Apr 16 14:57:04.827667 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827608 2580 scope.go:117] "RemoveContainer" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.827831 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827812 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} err="failed to get container status \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": rpc error: code = NotFound desc = could not find container \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": container with ID starting with e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065 not found: ID does not exist" Apr 16 14:57:04.827904 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.827832 2580 scope.go:117] "RemoveContainer" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.828093 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828071 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} err="failed to get container status \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": rpc error: code = NotFound desc = could not find container \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": container with ID starting with 60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12 not found: ID does not exist" Apr 16 14:57:04.828132 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828093 2580 scope.go:117] "RemoveContainer" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.828284 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828266 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} err="failed to get container status \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": rpc error: code = NotFound desc = could not find container \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": container with ID starting with 36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48 not found: ID does not exist" Apr 16 14:57:04.828345 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828285 2580 scope.go:117] "RemoveContainer" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.828463 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828446 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} err="failed to get container status \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": rpc error: code = NotFound desc = could not find container \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": container with ID starting with a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060 not found: ID does not exist" Apr 16 14:57:04.828508 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828463 2580 scope.go:117] "RemoveContainer" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.828651 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828637 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13"} err="failed to get container status \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": rpc error: code = NotFound desc = could not find container \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": container with ID starting with ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13 not found: ID does not exist" Apr 16 14:57:04.828696 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828652 2580 scope.go:117] "RemoveContainer" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.828818 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828802 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} err="failed to get container status \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": rpc error: code = NotFound desc = could not find container \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": container with ID starting with 78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb not found: ID does not exist" Apr 16 14:57:04.828897 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.828818 2580 scope.go:117] "RemoveContainer" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.829026 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829010 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} err="failed to get container status \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": rpc error: code = NotFound desc = could not find container \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": container with ID starting with 75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd not found: ID does not exist" Apr 16 14:57:04.829068 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829031 2580 scope.go:117] "RemoveContainer" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.829240 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829225 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} err="failed to get container status \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": rpc error: code = NotFound desc = could not find container 
\"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": container with ID starting with e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065 not found: ID does not exist" Apr 16 14:57:04.829278 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829240 2580 scope.go:117] "RemoveContainer" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.829476 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829458 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} err="failed to get container status \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": rpc error: code = NotFound desc = could not find container \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": container with ID starting with 60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12 not found: ID does not exist" Apr 16 14:57:04.829540 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829478 2580 scope.go:117] "RemoveContainer" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.829673 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829655 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} err="failed to get container status \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": rpc error: code = NotFound desc = could not find container \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": container with ID starting with 36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48 not found: ID does not exist" Apr 16 14:57:04.829736 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829674 2580 scope.go:117] "RemoveContainer" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.829884 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829859 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} err="failed to get container status \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": rpc error: code = NotFound desc = could not find container \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": container with ID starting with a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060 not found: ID does not exist" Apr 16 14:57:04.829884 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.829882 2580 scope.go:117] "RemoveContainer" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.830089 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830074 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13"} err="failed to get container status \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": rpc error: code = NotFound desc = could not find container \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": container with ID starting with ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13 not found: ID does not exist" Apr 16 14:57:04.830131 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830089 2580 scope.go:117] 
"RemoveContainer" containerID="78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb" Apr 16 14:57:04.830340 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830297 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb"} err="failed to get container status \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": rpc error: code = NotFound desc = could not find container \"78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb\": container with ID starting with 78d2acf02a49e87e8f8436784b2e93730fb780bb882bc70eea674fd7cd598fbb not found: ID does not exist" Apr 16 14:57:04.830383 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830341 2580 scope.go:117] "RemoveContainer" containerID="75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd" Apr 16 14:57:04.830569 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830551 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd"} err="failed to get container status \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": rpc error: code = NotFound desc = could not find container \"75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd\": container with ID starting with 75ff20dba7462d8d6da605d69f7bd21d2949bee27e33f3f8433d862fcc4991bd not found: ID does not exist" Apr 16 14:57:04.830569 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830568 2580 scope.go:117] "RemoveContainer" containerID="e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065" Apr 16 14:57:04.830784 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830767 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065"} err="failed to get container status \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": rpc error: code = NotFound desc = could not find container \"e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065\": container with ID starting with e2823c9025c687c03a579071f1279d313151cc404bf26a67296c6e9d707c3065 not found: ID does not exist" Apr 16 14:57:04.830859 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.830784 2580 scope.go:117] "RemoveContainer" containerID="60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12" Apr 16 14:57:04.831024 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.831008 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12"} err="failed to get container status \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": rpc error: code = NotFound desc = could not find container \"60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12\": container with ID starting with 60013ec88e564794456ec8a50e0a049a84cd06223050f330a394f9560cce6e12 not found: ID does not exist" Apr 16 14:57:04.831063 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.831025 2580 scope.go:117] "RemoveContainer" containerID="36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48" Apr 16 14:57:04.831222 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.831208 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48"} err="failed to get container status \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": rpc error: code = NotFound desc = could not find container \"36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48\": container with ID starting with 36cf3a2b6ace3b9981f60efc0a97af138939eb4e462ff2384c2eaf6761e7fe48 not found: ID does not exist" Apr 16 14:57:04.831262 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.831222 2580 scope.go:117] "RemoveContainer" containerID="a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060" Apr 16 14:57:04.831417 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.831403 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060"} err="failed to get container status \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": rpc error: code = NotFound desc = could not find container \"a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060\": container with ID starting with a0af7f9a95f2907cabae439123021d90d21ac58cc31d915f0d3fa2c081010060 not found: ID does not exist" Apr 16 14:57:04.831463 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.831417 2580 scope.go:117] "RemoveContainer" containerID="ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13" Apr 16 14:57:04.831605 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.831590 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13"} err="failed to get container status \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": rpc error: code = NotFound desc = could not find container \"ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13\": container with ID starting with ce7c4edd5dfad9967e4264672dc17baaa38f8c0ac0cc4e4ba694605443768c13 not found: ID does not exist" Apr 16 14:57:04.878077 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878058 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-metrics-client-ca\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878149 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878085 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsqws\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-kube-api-access-wsqws\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878149 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878106 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-kube-rbac-proxy\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878149 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878123 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-config-out\") pod 
\"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878149 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878144 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-web-config\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878347 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878170 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-rulefiles-0\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878347 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878225 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-serving-certs-ca-bundle\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878347 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878252 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-config\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.878347 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878282 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-thanos-prometheus-http-client-file\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878716 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878736 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878808 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-trusted-ca-bundle\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878899 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-tls-assets\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878931 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-metrics-client-certs\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.878957 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-kubelet-serving-ca-bundle\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879000 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-tls\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879026 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-grpc-tls\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879054 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879079 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879158 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-db\") pod 
\"8bcf4cba-e4bb-420b-910e-252d93e6275a\" (UID: \"8bcf4cba-e4bb-420b-910e-252d93e6275a\") " Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879432 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879452 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-metrics-client-ca\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.879951 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:04.880616 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.880385 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:04.881281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.880763 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:57:04.881281 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.880768 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:57:04.881535 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.881511 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-kube-api-access-wsqws" (OuterVolumeSpecName: "kube-api-access-wsqws") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "kube-api-access-wsqws". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:57:04.881798 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.881772 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.881914 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.881799 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-config" (OuterVolumeSpecName: "config") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.882078 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.882055 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-config-out" (OuterVolumeSpecName: "config-out") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:57:04.882292 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.882229 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.883261 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.883228 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.883427 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.883400 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:57:04.883903 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.883882 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.884054 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.884034 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.884129 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.884060 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.884129 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.884111 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.892521 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.892498 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-web-config" (OuterVolumeSpecName: "web-config") pod "8bcf4cba-e4bb-420b-910e-252d93e6275a" (UID: "8bcf4cba-e4bb-420b-910e-252d93e6275a"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:57:04.933327 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.933305 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lnt9d\"" Apr 16 14:57:04.941278 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.941255 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kgs47" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980696 2580 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-kube-rbac-proxy\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980731 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-config-out\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980754 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-web-config\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980768 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980781 2580 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-config\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980796 2580 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980816 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980859 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-tls-assets\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980874 2580 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-metrics-client-certs\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980888 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8bcf4cba-e4bb-420b-910e-252d93e6275a-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980907 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980921 2580 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-grpc-tls\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980934 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980954 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8bcf4cba-e4bb-420b-910e-252d93e6275a-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980969 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8bcf4cba-e4bb-420b-910e-252d93e6275a-prometheus-k8s-db\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:04.985018 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:04.980984 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wsqws\" (UniqueName: \"kubernetes.io/projected/8bcf4cba-e4bb-420b-910e-252d93e6275a-kube-api-access-wsqws\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 14:57:05.058949 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.058925 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kgs47"] Apr 16 14:57:05.061348 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:57:05.061321 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c30c303_f0bf_425c_bb3f_ce75dde11fe3.slice/crio-953f22b05546f562009b9215237636a7cce53872da978eeb119cc2a8931312c7 WatchSource:0}: Error finding container 953f22b05546f562009b9215237636a7cce53872da978eeb119cc2a8931312c7: Status 404 returned error can't find the container with id 953f22b05546f562009b9215237636a7cce53872da978eeb119cc2a8931312c7 Apr 16 14:57:05.087100 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.087079 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:57:05.090663 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.090642 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:57:05.115162 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115138 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:57:05.115449 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115435 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="prometheus" Apr 16 14:57:05.115489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115451 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" 
containerName="prometheus" Apr 16 14:57:05.115489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115463 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy-web" Apr 16 14:57:05.115489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115469 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy-web" Apr 16 14:57:05.115489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115483 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="init-config-reloader" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115492 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="init-config-reloader" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115503 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="config-reloader" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115509 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="config-reloader" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115515 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="thanos-sidecar" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115520 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="thanos-sidecar" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115527 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115532 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115538 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy-thanos" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115543 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy-thanos" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115590 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy-thanos" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115599 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="thanos-sidecar" Apr 16 14:57:05.115604 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115606 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="config-reloader" Apr 16 14:57:05.115952 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115612 2580 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="prometheus" Apr 16 14:57:05.115952 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115618 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy-web" Apr 16 14:57:05.115952 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.115625 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" containerName="kube-rbac-proxy" Apr 16 14:57:05.121593 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.121572 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.123914 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.123891 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:57:05.123997 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.123923 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:57:05.123997 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.123893 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-xnggt\"" Apr 16 14:57:05.123997 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.123974 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:57:05.124210 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124196 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:57:05.124280 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124263 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:57:05.124325 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124285 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:57:05.124523 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124505 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:57:05.124704 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124540 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-nqohqcqthhs4\"" Apr 16 14:57:05.124704 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124619 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:57:05.124881 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124867 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:57:05.124881 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.124876 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:57:05.125028 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.125012 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:57:05.127368 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.127349 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:57:05.131160 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.131141 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:57:05.131798 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.131780 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:57:05.283251 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283217 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283251 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283253 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283274 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283290 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283396 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283427 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283487 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283520 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283535 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283577 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283604 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5467h\" (UniqueName: \"kubernetes.io/projected/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-kube-api-access-5467h\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-config\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283656 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283716 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.283791 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283742 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.284073 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.283797 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385232 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385199 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385376 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385235 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385376 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385258 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385377 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5467h\" (UniqueName: \"kubernetes.io/projected/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-kube-api-access-5467h\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-config\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385489 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385449 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385630 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385492 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.385800 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385880 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385917 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.385990 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386015 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386066 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386091 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386100 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386141 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386169 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386167 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.386942 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.386437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.388352 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.388322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-config-out\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.389037 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.388613 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.389037 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.388789 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-config\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.389037 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.388913 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.389245 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.389102 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.389335 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.389311 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.389401 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.389380 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.389522 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.389498 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.390203 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.390177 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.390303 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.390229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.390589 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.390566 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.390707 ip-10-0-137-160 
kubenswrapper[2580]: I0416 14:57:05.390686 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.391646 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.391627 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.391722 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.391706 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.391768 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.391740 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-web-config\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.395702 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.395680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5467h\" (UniqueName: \"kubernetes.io/projected/6ba0d8cd-bc51-43f5-843d-a9ab7a40000c-kube-api-access-5467h\") pod \"prometheus-k8s-0\" (UID: \"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.432762 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.432744 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:05.583126 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.582779 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:57:05.586020 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:57:05.585994 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba0d8cd_bc51_43f5_843d_a9ab7a40000c.slice/crio-41b36b709e63178a2d843104815fbad8479bf50d4c0b290edb06e4eef79cbb0f WatchSource:0}: Error finding container 41b36b709e63178a2d843104815fbad8479bf50d4c0b290edb06e4eef79cbb0f: Status 404 returned error can't find the container with id 41b36b709e63178a2d843104815fbad8479bf50d4c0b290edb06e4eef79cbb0f Apr 16 14:57:05.771928 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.771892 2580 generic.go:358] "Generic (PLEG): container finished" podID="6ba0d8cd-bc51-43f5-843d-a9ab7a40000c" containerID="af10f952e5824f0c354bf90efe82cdd1bbb4720ce1ec54dc2f06a59380389342" exitCode=0 Apr 16 14:57:05.772063 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.771988 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerDied","Data":"af10f952e5824f0c354bf90efe82cdd1bbb4720ce1ec54dc2f06a59380389342"} Apr 16 14:57:05.772063 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.772030 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerStarted","Data":"41b36b709e63178a2d843104815fbad8479bf50d4c0b290edb06e4eef79cbb0f"} Apr 16 14:57:05.774133 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:05.774107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kgs47" event={"ID":"5c30c303-f0bf-425c-bb3f-ce75dde11fe3","Type":"ContainerStarted","Data":"953f22b05546f562009b9215237636a7cce53872da978eeb119cc2a8931312c7"} Apr 16 14:57:06.780107 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.780075 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerStarted","Data":"bb0e9c840c47b88b3bebe32ae2855a586df42e8fd9b95ef3ea7b2d264d498ebd"} Apr 16 14:57:06.780107 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.780109 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerStarted","Data":"8d7fbf1ec270bbf2a2cfc2cf36ead4726d5bd96a644d15b58863fa2f3ef2092b"} Apr 16 14:57:06.780564 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.780121 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerStarted","Data":"dfc62c963fba79cfb2dfe07f3edfffa190bb5927104f4d60f593a94200cceaf5"} Apr 16 14:57:06.780564 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.780133 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerStarted","Data":"5a528bfc22b69881c8e0690196ba07011f2f1f2fb965919baff998ccf162a73f"} Apr 16 14:57:06.780564 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.780146 2580 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerStarted","Data":"c1b414bd534abf303a36a14b7bfcb601f839cf167ba893e2b84222078db1b6d8"} Apr 16 14:57:06.780564 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.780158 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6ba0d8cd-bc51-43f5-843d-a9ab7a40000c","Type":"ContainerStarted","Data":"674b25b6e0baa11285d23f8ccf3dc41c913c4632cb21515336197ccbee5667a9"} Apr 16 14:57:06.781708 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.781686 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kgs47" event={"ID":"5c30c303-f0bf-425c-bb3f-ce75dde11fe3","Type":"ContainerStarted","Data":"386a466b79d0e5da526817c54981c73cf85a89eafc492cf971d449d37f78c743"} Apr 16 14:57:06.781800 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.781714 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kgs47" event={"ID":"5c30c303-f0bf-425c-bb3f-ce75dde11fe3","Type":"ContainerStarted","Data":"ea2c6c4410064ad172b5ad9c7ef5d31c12fb333c87e8c2c91278c33de53b44ac"} Apr 16 14:57:06.807096 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.807057 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.807040891 podStartE2EDuration="1.807040891s" podCreationTimestamp="2026-04-16 14:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:57:06.805280359 +0000 UTC m=+254.458662734" watchObservedRunningTime="2026-04-16 14:57:06.807040891 +0000 UTC m=+254.460423267" Apr 16 14:57:06.820450 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.820412 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kgs47" podStartSLOduration=252.818687098 podStartE2EDuration="4m13.820396916s" podCreationTimestamp="2026-04-16 14:52:53 +0000 UTC" firstStartedPulling="2026-04-16 14:57:05.063095471 +0000 UTC m=+252.716477823" lastFinishedPulling="2026-04-16 14:57:06.064805286 +0000 UTC m=+253.718187641" observedRunningTime="2026-04-16 14:57:06.819883129 +0000 UTC m=+254.473265520" watchObservedRunningTime="2026-04-16 14:57:06.820396916 +0000 UTC m=+254.473779291" Apr 16 14:57:06.934321 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:06.934293 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bcf4cba-e4bb-420b-910e-252d93e6275a" path="/var/lib/kubelet/pods/8bcf4cba-e4bb-420b-910e-252d93e6275a/volumes" Apr 16 14:57:10.433536 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:10.433504 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:57:52.800873 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:52.795548 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 14:57:52.800873 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:52.796164 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 14:57:52.809215 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:52.809193 2580 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 14:57:52.809383 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:52.809364 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 14:57:52.812221 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:57:52.812204 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:58:05.433340 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:05.433280 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:58:05.449079 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:05.449056 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:58:05.974436 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:05.974405 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:58:35.890358 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:35.890284 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-js8tx"] Apr 16 14:58:35.893462 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:35.893442 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:35.895693 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:35.895676 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 14:58:35.899905 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:35.899882 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-js8tx"] Apr 16 14:58:36.065257 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.065221 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e6782489-2452-4e2c-9e5c-ff11be2380fe-original-pull-secret\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.065425 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.065320 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e6782489-2452-4e2c-9e5c-ff11be2380fe-kubelet-config\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.065425 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.065401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e6782489-2452-4e2c-9e5c-ff11be2380fe-dbus\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.166376 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.166298 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e6782489-2452-4e2c-9e5c-ff11be2380fe-dbus\") pod \"global-pull-secret-syncer-js8tx\" (UID: 
\"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.166376 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.166358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e6782489-2452-4e2c-9e5c-ff11be2380fe-original-pull-secret\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.166601 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.166406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e6782489-2452-4e2c-9e5c-ff11be2380fe-kubelet-config\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.166601 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.166500 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e6782489-2452-4e2c-9e5c-ff11be2380fe-kubelet-config\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.166601 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.166500 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e6782489-2452-4e2c-9e5c-ff11be2380fe-dbus\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.168752 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.168728 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e6782489-2452-4e2c-9e5c-ff11be2380fe-original-pull-secret\") pod \"global-pull-secret-syncer-js8tx\" (UID: \"e6782489-2452-4e2c-9e5c-ff11be2380fe\") " pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.202884 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.202857 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-js8tx" Apr 16 14:58:36.326827 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.326804 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-js8tx"] Apr 16 14:58:36.329584 ip-10-0-137-160 kubenswrapper[2580]: W0416 14:58:36.329552 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6782489_2452_4e2c_9e5c_ff11be2380fe.slice/crio-85a41255f2e501bcdb29314e42abcbb687b5f433fbdacbac1de52f2b210684a8 WatchSource:0}: Error finding container 85a41255f2e501bcdb29314e42abcbb687b5f433fbdacbac1de52f2b210684a8: Status 404 returned error can't find the container with id 85a41255f2e501bcdb29314e42abcbb687b5f433fbdacbac1de52f2b210684a8 Apr 16 14:58:36.331484 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:36.331471 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:58:37.052060 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:37.052009 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-js8tx" event={"ID":"e6782489-2452-4e2c-9e5c-ff11be2380fe","Type":"ContainerStarted","Data":"85a41255f2e501bcdb29314e42abcbb687b5f433fbdacbac1de52f2b210684a8"} Apr 16 14:58:40.062972 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:40.062892 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-js8tx" event={"ID":"e6782489-2452-4e2c-9e5c-ff11be2380fe","Type":"ContainerStarted","Data":"61e2216c24867247c157a5b0c40d52971f835a2730359f536523592d851917cb"} Apr 16 14:58:40.076067 ip-10-0-137-160 kubenswrapper[2580]: I0416 14:58:40.076001 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-js8tx" podStartSLOduration=1.609782648 podStartE2EDuration="5.07598463s" podCreationTimestamp="2026-04-16 14:58:35 +0000 UTC" firstStartedPulling="2026-04-16 14:58:36.331591373 +0000 UTC m=+343.984973726" lastFinishedPulling="2026-04-16 14:58:39.797793348 +0000 UTC m=+347.451175708" observedRunningTime="2026-04-16 14:58:40.075157197 +0000 UTC m=+347.728539572" watchObservedRunningTime="2026-04-16 14:58:40.07598463 +0000 UTC m=+347.729367006" Apr 16 15:02:43.122543 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.122506 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fkp94"] Apr 16 15:02:43.125996 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.125978 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:43.128652 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.128632 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 15:02:43.128760 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.128744 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-v7rz7\"" Apr 16 15:02:43.128862 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.128827 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 15:02:43.134244 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.134224 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fkp94"] Apr 16 15:02:43.286520 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.286491 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4jp\" (UniqueName: \"kubernetes.io/projected/92c13889-13b4-4c5c-a97c-1a673a5bc26e-kube-api-access-9k4jp\") pod \"cert-manager-webhook-597b96b99b-fkp94\" (UID: \"92c13889-13b4-4c5c-a97c-1a673a5bc26e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:43.286661 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.286541 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92c13889-13b4-4c5c-a97c-1a673a5bc26e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fkp94\" (UID: \"92c13889-13b4-4c5c-a97c-1a673a5bc26e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:43.387383 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.387302 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k4jp\" (UniqueName: \"kubernetes.io/projected/92c13889-13b4-4c5c-a97c-1a673a5bc26e-kube-api-access-9k4jp\") pod \"cert-manager-webhook-597b96b99b-fkp94\" (UID: \"92c13889-13b4-4c5c-a97c-1a673a5bc26e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:43.387383 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.387367 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92c13889-13b4-4c5c-a97c-1a673a5bc26e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fkp94\" (UID: \"92c13889-13b4-4c5c-a97c-1a673a5bc26e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:43.397729 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.397704 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92c13889-13b4-4c5c-a97c-1a673a5bc26e-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-fkp94\" (UID: \"92c13889-13b4-4c5c-a97c-1a673a5bc26e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:43.397874 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.397856 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k4jp\" (UniqueName: \"kubernetes.io/projected/92c13889-13b4-4c5c-a97c-1a673a5bc26e-kube-api-access-9k4jp\") pod \"cert-manager-webhook-597b96b99b-fkp94\" (UID: \"92c13889-13b4-4c5c-a97c-1a673a5bc26e\") " pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 
15:02:43.446752 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.446727 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:43.562085 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.562054 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-fkp94"] Apr 16 15:02:43.566179 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:02:43.566141 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92c13889_13b4_4c5c_a97c_1a673a5bc26e.slice/crio-4ede65d218f2fb2bd7a79c4ce094a825a0dc58964c488750adf85e26c206da64 WatchSource:0}: Error finding container 4ede65d218f2fb2bd7a79c4ce094a825a0dc58964c488750adf85e26c206da64: Status 404 returned error can't find the container with id 4ede65d218f2fb2bd7a79c4ce094a825a0dc58964c488750adf85e26c206da64 Apr 16 15:02:43.760676 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:43.760600 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" event={"ID":"92c13889-13b4-4c5c-a97c-1a673a5bc26e","Type":"ContainerStarted","Data":"4ede65d218f2fb2bd7a79c4ce094a825a0dc58964c488750adf85e26c206da64"} Apr 16 15:02:47.773911 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:47.773873 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" event={"ID":"92c13889-13b4-4c5c-a97c-1a673a5bc26e","Type":"ContainerStarted","Data":"87d3fd5367addf2a0a9ccda40afad6cec720f8f9ce4ba23e2aa1b713309d0cc2"} Apr 16 15:02:47.774280 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:47.773934 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:02:47.792098 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:47.792054 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" podStartSLOduration=1.40452595 podStartE2EDuration="4.792041917s" podCreationTimestamp="2026-04-16 15:02:43 +0000 UTC" firstStartedPulling="2026-04-16 15:02:43.568022859 +0000 UTC m=+591.221405211" lastFinishedPulling="2026-04-16 15:02:46.955538827 +0000 UTC m=+594.608921178" observedRunningTime="2026-04-16 15:02:47.790647274 +0000 UTC m=+595.444029649" watchObservedRunningTime="2026-04-16 15:02:47.792041917 +0000 UTC m=+595.445424292" Apr 16 15:02:52.824700 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:52.824672 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:02:52.825132 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:52.824677 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:02:52.831321 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:52.831303 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:02:52.831451 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:52.831348 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:02:53.780211 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:02:53.780183 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-fkp94" Apr 16 15:03:16.636577 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.636544 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n"] Apr 16 15:03:16.638458 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.638442 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.641645 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.641613 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 15:03:16.641645 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.641622 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-jtnv8\"" Apr 16 15:03:16.641873 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.641648 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 15:03:16.641873 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.641693 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 15:03:16.641873 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.641622 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 15:03:16.641873 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.641624 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 15:03:16.651667 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.651631 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n"] Apr 16 15:03:16.744973 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.744935 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-manager-config\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.744973 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.744978 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-metrics-cert\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.745164 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.745042 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2bs\" (UniqueName: \"kubernetes.io/projected/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-kube-api-access-cm2bs\") pod \"lws-controller-manager-5bfb495c77-8964n\" 
(UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.745164 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.745092 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-cert\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.846387 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.846342 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-metrics-cert\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.846551 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.846413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2bs\" (UniqueName: \"kubernetes.io/projected/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-kube-api-access-cm2bs\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.846551 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.846464 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-cert\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.846551 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.846513 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-manager-config\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.847203 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.847181 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-manager-config\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.849619 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.849597 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-metrics-cert\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.849696 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.849680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-cert\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " 
pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.854425 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.854397 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2bs\" (UniqueName: \"kubernetes.io/projected/e7415c4a-0b15-4a23-a1f7-024a2d5b2d66-kube-api-access-cm2bs\") pod \"lws-controller-manager-5bfb495c77-8964n\" (UID: \"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66\") " pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:16.947300 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:16.947225 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:17.074550 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:17.074527 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n"] Apr 16 15:03:17.077408 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:03:17.077386 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7415c4a_0b15_4a23_a1f7_024a2d5b2d66.slice/crio-27def729e4d379383e0873a27911c563bced231de70fd408fa6a602795a0a0d8 WatchSource:0}: Error finding container 27def729e4d379383e0873a27911c563bced231de70fd408fa6a602795a0a0d8: Status 404 returned error can't find the container with id 27def729e4d379383e0873a27911c563bced231de70fd408fa6a602795a0a0d8 Apr 16 15:03:17.866012 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:17.865975 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" event={"ID":"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66","Type":"ContainerStarted","Data":"27def729e4d379383e0873a27911c563bced231de70fd408fa6a602795a0a0d8"} Apr 16 15:03:19.874495 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:19.874460 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" event={"ID":"e7415c4a-0b15-4a23-a1f7-024a2d5b2d66","Type":"ContainerStarted","Data":"083112345f7bcca94953886be4d91b778380cc5ae0d62fb9cea693a3dfc78b4f"} Apr 16 15:03:19.874873 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:19.874574 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:03:19.891916 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:19.891873 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" podStartSLOduration=1.4813042109999999 podStartE2EDuration="3.891859333s" podCreationTimestamp="2026-04-16 15:03:16 +0000 UTC" firstStartedPulling="2026-04-16 15:03:17.079826148 +0000 UTC m=+624.733208515" lastFinishedPulling="2026-04-16 15:03:19.490381285 +0000 UTC m=+627.143763637" observedRunningTime="2026-04-16 15:03:19.890199135 +0000 UTC m=+627.543581507" watchObservedRunningTime="2026-04-16 15:03:19.891859333 +0000 UTC m=+627.545241697" Apr 16 15:03:30.879741 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:03:30.879714 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bfb495c77-8964n" Apr 16 15:04:14.855533 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:14.855505 2580 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg"] Apr 16 15:04:14.858774 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:14.858753 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" Apr 16 15:04:14.863127 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:14.863106 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 15:04:14.863229 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:14.863159 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 15:04:14.863289 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:14.863276 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-bvsfx\"" Apr 16 15:04:14.868396 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:14.868374 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg"] Apr 16 15:04:14.903106 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:14.903076 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncfv6\" (UniqueName: \"kubernetes.io/projected/5fd777a3-efab-49b1-a04c-c04fc56165de-kube-api-access-ncfv6\") pod \"limitador-operator-controller-manager-c7fb4c8d5-hm2tg\" (UID: \"5fd777a3-efab-49b1-a04c-c04fc56165de\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" Apr 16 15:04:15.004134 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:15.004105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncfv6\" (UniqueName: \"kubernetes.io/projected/5fd777a3-efab-49b1-a04c-c04fc56165de-kube-api-access-ncfv6\") pod \"limitador-operator-controller-manager-c7fb4c8d5-hm2tg\" (UID: \"5fd777a3-efab-49b1-a04c-c04fc56165de\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" Apr 16 15:04:15.018806 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:15.018781 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncfv6\" (UniqueName: \"kubernetes.io/projected/5fd777a3-efab-49b1-a04c-c04fc56165de-kube-api-access-ncfv6\") pod \"limitador-operator-controller-manager-c7fb4c8d5-hm2tg\" (UID: \"5fd777a3-efab-49b1-a04c-c04fc56165de\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" Apr 16 15:04:15.169671 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:15.169603 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" Apr 16 15:04:15.294737 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:15.294711 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg"] Apr 16 15:04:15.297460 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:04:15.297433 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd777a3_efab_49b1_a04c_c04fc56165de.slice/crio-f8e0d243e0a1c6208af8b7586f12d077056c4040d7b2561fad67a7cfd40453eb WatchSource:0}: Error finding container f8e0d243e0a1c6208af8b7586f12d077056c4040d7b2561fad67a7cfd40453eb: Status 404 returned error can't find the container with id f8e0d243e0a1c6208af8b7586f12d077056c4040d7b2561fad67a7cfd40453eb Apr 16 15:04:15.299510 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:15.299494 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:04:16.031104 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:16.031066 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" event={"ID":"5fd777a3-efab-49b1-a04c-c04fc56165de","Type":"ContainerStarted","Data":"f8e0d243e0a1c6208af8b7586f12d077056c4040d7b2561fad67a7cfd40453eb"} Apr 16 15:04:18.039960 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:18.039874 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" event={"ID":"5fd777a3-efab-49b1-a04c-c04fc56165de","Type":"ContainerStarted","Data":"16b5c2b97496f227acbac3662c64cc7249c8612ea89f8044d441faabe334ec03"} Apr 16 15:04:18.040286 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:18.039975 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" Apr 16 15:04:18.065046 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:18.064998 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" podStartSLOduration=1.5750066299999999 podStartE2EDuration="4.064986236s" podCreationTimestamp="2026-04-16 15:04:14 +0000 UTC" firstStartedPulling="2026-04-16 15:04:15.299620015 +0000 UTC m=+682.953002367" lastFinishedPulling="2026-04-16 15:04:17.789599621 +0000 UTC m=+685.442981973" observedRunningTime="2026-04-16 15:04:18.063691848 +0000 UTC m=+685.717074222" watchObservedRunningTime="2026-04-16 15:04:18.064986236 +0000 UTC m=+685.718368610" Apr 16 15:04:19.506622 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.506575 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q"] Apr 16 15:04:19.513959 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.513918 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.515759 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.515727 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q"] Apr 16 15:04:19.516067 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.516049 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 15:04:19.516155 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.516052 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 15:04:19.516155 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.516101 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2vbh9\"" Apr 16 15:04:19.541891 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.541869 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6mb5\" (UniqueName: \"kubernetes.io/projected/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-kube-api-access-t6mb5\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.542013 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.541913 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.542013 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.541937 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.642880 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.642824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.642984 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.642896 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.642984 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:04:19.642952 2580 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 16 15:04:19.642984 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.642975 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t6mb5\" (UniqueName: \"kubernetes.io/projected/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-kube-api-access-t6mb5\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.643107 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:04:19.643002 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-plugin-serving-cert podName:ab3b5db3-14b3-4d11-9a61-e95b243cb02b nodeName:}" failed. No retries permitted until 2026-04-16 15:04:20.14298775 +0000 UTC m=+687.796370102 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-c2x8q" (UID: "ab3b5db3-14b3-4d11-9a61-e95b243cb02b") : secret "plugin-serving-cert" not found Apr 16 15:04:19.643462 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.643445 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:19.653613 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:19.653596 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6mb5\" (UniqueName: \"kubernetes.io/projected/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-kube-api-access-t6mb5\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:20.148124 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:20.148095 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:20.150440 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:20.150416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab3b5db3-14b3-4d11-9a61-e95b243cb02b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-c2x8q\" (UID: \"ab3b5db3-14b3-4d11-9a61-e95b243cb02b\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:20.424532 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:20.424460 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" Apr 16 15:04:20.550328 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:20.550296 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q"] Apr 16 15:04:20.553423 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:04:20.553396 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab3b5db3_14b3_4d11_9a61_e95b243cb02b.slice/crio-679938a6f50a05df41b0444d25c5404f6435ac9d0dbad6ee9f3317dfecd29323 WatchSource:0}: Error finding container 679938a6f50a05df41b0444d25c5404f6435ac9d0dbad6ee9f3317dfecd29323: Status 404 returned error can't find the container with id 679938a6f50a05df41b0444d25c5404f6435ac9d0dbad6ee9f3317dfecd29323 Apr 16 15:04:21.052898 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:21.052830 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" event={"ID":"ab3b5db3-14b3-4d11-9a61-e95b243cb02b","Type":"ContainerStarted","Data":"679938a6f50a05df41b0444d25c5404f6435ac9d0dbad6ee9f3317dfecd29323"} Apr 16 15:04:26.073542 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:26.073509 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" event={"ID":"ab3b5db3-14b3-4d11-9a61-e95b243cb02b","Type":"ContainerStarted","Data":"6cf62834a09015e22f4e0da84f7bae725e7b66e86f555ff935bd8b540782906d"} Apr 16 15:04:26.091929 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:26.091886 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-c2x8q" podStartSLOduration=2.6188366199999997 podStartE2EDuration="7.091871559s" podCreationTimestamp="2026-04-16 15:04:19 +0000 UTC" firstStartedPulling="2026-04-16 15:04:20.555158671 +0000 UTC m=+688.208541023" lastFinishedPulling="2026-04-16 15:04:25.028193607 +0000 UTC m=+692.681575962" observedRunningTime="2026-04-16 15:04:26.090214006 +0000 UTC m=+693.743596389" watchObservedRunningTime="2026-04-16 15:04:26.091871559 +0000 UTC m=+693.745253932" Apr 16 15:04:29.045579 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:04:29.045549 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hm2tg" Apr 16 15:05:03.245965 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.245930 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-fzj2d"] Apr 16 15:05:03.271704 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.271678 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-fzj2d"] Apr 16 15:05:03.271704 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.271704 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-fzj2d"] Apr 16 15:05:03.271940 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.271793 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.274104 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.274075 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 15:05:03.392567 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.392538 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4ff8b8dd-6ce4-4a06-8131-0e00309c7be2-config-file\") pod \"limitador-limitador-67566c68b4-fzj2d\" (UID: \"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2\") " pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.392712 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.392579 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j78sj\" (UniqueName: \"kubernetes.io/projected/4ff8b8dd-6ce4-4a06-8131-0e00309c7be2-kube-api-access-j78sj\") pod \"limitador-limitador-67566c68b4-fzj2d\" (UID: \"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2\") " pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.493325 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.493299 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j78sj\" (UniqueName: \"kubernetes.io/projected/4ff8b8dd-6ce4-4a06-8131-0e00309c7be2-kube-api-access-j78sj\") pod \"limitador-limitador-67566c68b4-fzj2d\" (UID: \"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2\") " pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.493443 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.493361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4ff8b8dd-6ce4-4a06-8131-0e00309c7be2-config-file\") pod \"limitador-limitador-67566c68b4-fzj2d\" (UID: \"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2\") " pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.493920 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.493903 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4ff8b8dd-6ce4-4a06-8131-0e00309c7be2-config-file\") pod \"limitador-limitador-67566c68b4-fzj2d\" (UID: \"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2\") " pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.504082 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.504029 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j78sj\" (UniqueName: \"kubernetes.io/projected/4ff8b8dd-6ce4-4a06-8131-0e00309c7be2-kube-api-access-j78sj\") pod \"limitador-limitador-67566c68b4-fzj2d\" (UID: \"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2\") " pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.581743 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.581716 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:03.706396 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:03.706368 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-fzj2d"] Apr 16 15:05:03.708895 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:05:03.708865 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff8b8dd_6ce4_4a06_8131_0e00309c7be2.slice/crio-22e4d5473b65308d6b43e07d34e2830fe735b5654f26c8a118f71e5fe8369885 WatchSource:0}: Error finding container 22e4d5473b65308d6b43e07d34e2830fe735b5654f26c8a118f71e5fe8369885: Status 404 returned error can't find the container with id 22e4d5473b65308d6b43e07d34e2830fe735b5654f26c8a118f71e5fe8369885 Apr 16 15:05:04.198629 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:04.198590 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" event={"ID":"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2","Type":"ContainerStarted","Data":"22e4d5473b65308d6b43e07d34e2830fe735b5654f26c8a118f71e5fe8369885"} Apr 16 15:05:05.203438 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:05.203392 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" event={"ID":"4ff8b8dd-6ce4-4a06-8131-0e00309c7be2","Type":"ContainerStarted","Data":"291981b29acb0999bf5102161aa192244db7e3b9efc0827424bf7498672232fa"} Apr 16 15:05:05.203795 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:05.203461 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:05:05.220753 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:05.220694 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" podStartSLOduration=1.187175935 podStartE2EDuration="2.220683087s" podCreationTimestamp="2026-04-16 15:05:03 +0000 UTC" firstStartedPulling="2026-04-16 15:05:03.710700096 +0000 UTC m=+731.364082448" lastFinishedPulling="2026-04-16 15:05:04.744207248 +0000 UTC m=+732.397589600" observedRunningTime="2026-04-16 15:05:05.218193171 +0000 UTC m=+732.871575545" watchObservedRunningTime="2026-04-16 15:05:05.220683087 +0000 UTC m=+732.874065492" Apr 16 15:05:16.209256 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:05:16.209231 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-fzj2d" Apr 16 15:07:51.639521 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.639490 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj"] Apr 16 15:07:51.643125 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.643104 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.647243 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.646808 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 15:07:51.647243 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.646872 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\"" Apr 16 15:07:51.647243 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.646908 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 15:07:51.647243 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.646327 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:07:51.655995 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.655971 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj"] Apr 16 15:07:51.717971 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.717941 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-model-cache\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.718126 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.717981 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.718126 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.717998 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-dshm\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.718126 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.718014 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba5949b-b104-437d-8e79-6064fa90e473-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.718126 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.718061 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-home\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.718265 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.718155 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdmnv\" (UniqueName: \"kubernetes.io/projected/4ba5949b-b104-437d-8e79-6064fa90e473-kube-api-access-qdmnv\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.818543 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.818520 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdmnv\" (UniqueName: \"kubernetes.io/projected/4ba5949b-b104-437d-8e79-6064fa90e473-kube-api-access-qdmnv\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.818662 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.818571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-model-cache\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.818662 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.818620 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.818662 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.818647 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-dshm\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.818804 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.818670 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba5949b-b104-437d-8e79-6064fa90e473-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.818804 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.818697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-home\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.819081 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.819062 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-model-cache\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.819133 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.819098 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.819170 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.819104 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-home\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.821191 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.821168 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-dshm\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.821463 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.821443 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba5949b-b104-437d-8e79-6064fa90e473-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.838119 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.838101 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdmnv\" (UniqueName: \"kubernetes.io/projected/4ba5949b-b104-437d-8e79-6064fa90e473-kube-api-access-qdmnv\") pod \"scheduler-configmap-ref-test-kserve-749946686-xsbqj\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:51.957951 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:51.957902 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:07:52.084759 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:52.084720 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj"] Apr 16 15:07:52.087042 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:07:52.087013 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ba5949b_b104_437d_8e79_6064fa90e473.slice/crio-f3ce0cac949489c153f1f1748a0acc40bce10f080e7ba1c1707c68077974a7e5 WatchSource:0}: Error finding container f3ce0cac949489c153f1f1748a0acc40bce10f080e7ba1c1707c68077974a7e5: Status 404 returned error can't find the container with id f3ce0cac949489c153f1f1748a0acc40bce10f080e7ba1c1707c68077974a7e5 Apr 16 15:07:52.764901 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:52.764862 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" event={"ID":"4ba5949b-b104-437d-8e79-6064fa90e473","Type":"ContainerStarted","Data":"f3ce0cac949489c153f1f1748a0acc40bce10f080e7ba1c1707c68077974a7e5"} Apr 16 15:07:52.851354 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:52.851318 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:07:52.851859 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:52.851813 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:07:52.859854 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:52.859820 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:07:52.860539 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:52.860506 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:07:55.776942 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:55.776832 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" event={"ID":"4ba5949b-b104-437d-8e79-6064fa90e473","Type":"ContainerStarted","Data":"299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658"} Apr 16 15:07:59.792599 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:59.792572 2580 generic.go:358] "Generic (PLEG): container finished" podID="4ba5949b-b104-437d-8e79-6064fa90e473" containerID="299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658" exitCode=0 Apr 16 15:07:59.792992 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:07:59.792627 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" event={"ID":"4ba5949b-b104-437d-8e79-6064fa90e473","Type":"ContainerDied","Data":"299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658"} Apr 16 15:08:01.801864 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:08:01.801768 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" 
event={"ID":"4ba5949b-b104-437d-8e79-6064fa90e473","Type":"ContainerStarted","Data":"cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896"} Apr 16 15:08:01.820965 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:08:01.820922 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" podStartSLOduration=1.3898974960000001 podStartE2EDuration="10.820908001s" podCreationTimestamp="2026-04-16 15:07:51 +0000 UTC" firstStartedPulling="2026-04-16 15:07:52.088809818 +0000 UTC m=+899.742192170" lastFinishedPulling="2026-04-16 15:08:01.519820319 +0000 UTC m=+909.173202675" observedRunningTime="2026-04-16 15:08:01.818648018 +0000 UTC m=+909.472030592" watchObservedRunningTime="2026-04-16 15:08:01.820908001 +0000 UTC m=+909.474290374" Apr 16 15:08:01.958729 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:08:01.958705 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:08:01.958914 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:08:01.958763 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:08:01.971259 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:08:01.971240 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:08:02.816898 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:08:02.816830 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:09:07.235043 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.234960 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj"] Apr 16 15:09:07.235555 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.235246 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" podUID="4ba5949b-b104-437d-8e79-6064fa90e473" containerName="main" containerID="cri-o://cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896" gracePeriod=30 Apr 16 15:09:07.483388 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.483368 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:09:07.632005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.631974 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-kserve-provision-location\") pod \"4ba5949b-b104-437d-8e79-6064fa90e473\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " Apr 16 15:09:07.632184 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632024 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-model-cache\") pod \"4ba5949b-b104-437d-8e79-6064fa90e473\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " Apr 16 15:09:07.632184 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632043 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdmnv\" (UniqueName: \"kubernetes.io/projected/4ba5949b-b104-437d-8e79-6064fa90e473-kube-api-access-qdmnv\") pod \"4ba5949b-b104-437d-8e79-6064fa90e473\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " Apr 16 15:09:07.632184 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632087 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-dshm\") pod \"4ba5949b-b104-437d-8e79-6064fa90e473\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " Apr 16 15:09:07.632184 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632132 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba5949b-b104-437d-8e79-6064fa90e473-tls-certs\") pod \"4ba5949b-b104-437d-8e79-6064fa90e473\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " Apr 16 15:09:07.632406 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632246 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-home\") pod \"4ba5949b-b104-437d-8e79-6064fa90e473\" (UID: \"4ba5949b-b104-437d-8e79-6064fa90e473\") " Apr 16 15:09:07.632406 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632298 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-model-cache" (OuterVolumeSpecName: "model-cache") pod "4ba5949b-b104-437d-8e79-6064fa90e473" (UID: "4ba5949b-b104-437d-8e79-6064fa90e473"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:07.632537 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632509 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-home" (OuterVolumeSpecName: "home") pod "4ba5949b-b104-437d-8e79-6064fa90e473" (UID: "4ba5949b-b104-437d-8e79-6064fa90e473"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:07.632595 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.632533 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-model-cache\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:07.634437 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.634414 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba5949b-b104-437d-8e79-6064fa90e473-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4ba5949b-b104-437d-8e79-6064fa90e473" (UID: "4ba5949b-b104-437d-8e79-6064fa90e473"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:09:07.634985 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.634955 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba5949b-b104-437d-8e79-6064fa90e473-kube-api-access-qdmnv" (OuterVolumeSpecName: "kube-api-access-qdmnv") pod "4ba5949b-b104-437d-8e79-6064fa90e473" (UID: "4ba5949b-b104-437d-8e79-6064fa90e473"). InnerVolumeSpecName "kube-api-access-qdmnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:09:07.635084 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.635051 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-dshm" (OuterVolumeSpecName: "dshm") pod "4ba5949b-b104-437d-8e79-6064fa90e473" (UID: "4ba5949b-b104-437d-8e79-6064fa90e473"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:07.685966 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.685935 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ba5949b-b104-437d-8e79-6064fa90e473" (UID: "4ba5949b-b104-437d-8e79-6064fa90e473"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:07.733928 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.733909 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-kserve-provision-location\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:07.733928 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.733929 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdmnv\" (UniqueName: \"kubernetes.io/projected/4ba5949b-b104-437d-8e79-6064fa90e473-kube-api-access-qdmnv\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:07.734065 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.733940 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-dshm\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:07.734065 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.733950 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4ba5949b-b104-437d-8e79-6064fa90e473-tls-certs\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:07.734065 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:07.733959 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4ba5949b-b104-437d-8e79-6064fa90e473-home\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:08.010910 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.010794 2580 generic.go:358] "Generic (PLEG): container finished" podID="4ba5949b-b104-437d-8e79-6064fa90e473" containerID="cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896" exitCode=0 Apr 16 15:09:08.010910 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.010825 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" event={"ID":"4ba5949b-b104-437d-8e79-6064fa90e473","Type":"ContainerDied","Data":"cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896"} Apr 16 15:09:08.010910 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.010882 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" event={"ID":"4ba5949b-b104-437d-8e79-6064fa90e473","Type":"ContainerDied","Data":"f3ce0cac949489c153f1f1748a0acc40bce10f080e7ba1c1707c68077974a7e5"} Apr 16 15:09:08.010910 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.010897 2580 scope.go:117] "RemoveContainer" containerID="cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896" Apr 16 15:09:08.010910 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.010899 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj" Apr 16 15:09:08.019622 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.019601 2580 scope.go:117] "RemoveContainer" containerID="299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658" Apr 16 15:09:08.031736 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.031707 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj"] Apr 16 15:09:08.032808 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.032784 2580 scope.go:117] "RemoveContainer" containerID="cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896" Apr 16 15:09:08.033078 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:09:08.033042 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896\": container with ID starting with cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896 not found: ID does not exist" containerID="cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896" Apr 16 15:09:08.033181 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.033079 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896"} err="failed to get container status \"cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896\": rpc error: code = NotFound desc = could not find container \"cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896\": container with ID starting with cad479263af13499310933002bcd09520ffa326705b2aa897da3e6df1a34d896 not found: ID does not exist" Apr 16 15:09:08.033181 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.033106 2580 scope.go:117] "RemoveContainer" containerID="299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658" Apr 16 15:09:08.033354 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:09:08.033331 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658\": container with ID starting with 299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658 not found: ID does not exist" containerID="299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658" Apr 16 15:09:08.033508 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.033360 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658"} err="failed to get container status \"299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658\": rpc error: code = NotFound desc = could not find container \"299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658\": container with ID starting with 299ee39c3b25332f580285523397f73225b8bc1a680c1822b4b9d55387193658 not found: ID does not exist" Apr 16 15:09:08.035263 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.035242 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-749946686-xsbqj"] Apr 16 15:09:08.935519 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:08.935489 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba5949b-b104-437d-8e79-6064fa90e473" 
path="/var/lib/kubelet/pods/4ba5949b-b104-437d-8e79-6064fa90e473/volumes" Apr 16 15:09:25.249298 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.249261 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj"] Apr 16 15:09:25.249762 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.249741 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ba5949b-b104-437d-8e79-6064fa90e473" containerName="storage-initializer" Apr 16 15:09:25.249895 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.249765 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba5949b-b104-437d-8e79-6064fa90e473" containerName="storage-initializer" Apr 16 15:09:25.249895 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.249814 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ba5949b-b104-437d-8e79-6064fa90e473" containerName="main" Apr 16 15:09:25.249895 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.249824 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba5949b-b104-437d-8e79-6064fa90e473" containerName="main" Apr 16 15:09:25.250013 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.249929 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ba5949b-b104-437d-8e79-6064fa90e473" containerName="main" Apr 16 15:09:25.255209 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.255187 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.258243 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.258223 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\"" Apr 16 15:09:25.258368 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.258222 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:09:25.258368 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.258222 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 15:09:25.258368 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.258271 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 15:09:25.263400 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.263381 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj"] Apr 16 15:09:25.366420 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.366388 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-home\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.366420 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.366422 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-dshm\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.366607 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.366448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/effc2d54-635f-4412-8f06-cbcf357dcee7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.366607 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.366509 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxc7l\" (UniqueName: \"kubernetes.io/projected/effc2d54-635f-4412-8f06-cbcf357dcee7-kube-api-access-hxc7l\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.366607 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.366590 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-model-cache\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.366708 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.366626 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.467670 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.467640 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxc7l\" (UniqueName: \"kubernetes.io/projected/effc2d54-635f-4412-8f06-cbcf357dcee7-kube-api-access-hxc7l\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.467871 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.467824 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-model-cache\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.467965 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.467912 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.468068 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.467998 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-home\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.468068 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.468030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-dshm\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.468178 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.468073 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/effc2d54-635f-4412-8f06-cbcf357dcee7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.468268 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.468248 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-model-cache\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.468324 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.468288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.468422 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.468384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-home\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.470655 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.470629 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-dshm\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.470953 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.470937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/effc2d54-635f-4412-8f06-cbcf357dcee7-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.489193 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.489168 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxc7l\" (UniqueName: \"kubernetes.io/projected/effc2d54-635f-4412-8f06-cbcf357dcee7-kube-api-access-hxc7l\") pod \"scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.567032 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.566938 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:25.690989 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.690959 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj"] Apr 16 15:09:25.693655 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:09:25.693610 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeffc2d54_635f_4412_8f06_cbcf357dcee7.slice/crio-6de04f4406bc1761f7247aa987be821b7e89f35ecb9a795e37701d9ddf265096 WatchSource:0}: Error finding container 6de04f4406bc1761f7247aa987be821b7e89f35ecb9a795e37701d9ddf265096: Status 404 returned error can't find the container with id 6de04f4406bc1761f7247aa987be821b7e89f35ecb9a795e37701d9ddf265096 Apr 16 15:09:25.695448 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:25.695434 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:09:26.067769 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:26.067730 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" event={"ID":"effc2d54-635f-4412-8f06-cbcf357dcee7","Type":"ContainerStarted","Data":"509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28"} Apr 16 15:09:26.067769 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:26.067772 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" event={"ID":"effc2d54-635f-4412-8f06-cbcf357dcee7","Type":"ContainerStarted","Data":"6de04f4406bc1761f7247aa987be821b7e89f35ecb9a795e37701d9ddf265096"} Apr 16 15:09:30.082869 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:30.082768 2580 generic.go:358] "Generic (PLEG): container finished" podID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerID="509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28" exitCode=0 Apr 16 15:09:30.083196 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:30.082858 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" event={"ID":"effc2d54-635f-4412-8f06-cbcf357dcee7","Type":"ContainerDied","Data":"509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28"} Apr 16 15:09:31.088678 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:31.088645 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" event={"ID":"effc2d54-635f-4412-8f06-cbcf357dcee7","Type":"ContainerStarted","Data":"c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88"} Apr 16 15:09:31.109135 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:31.109082 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" 
podStartSLOduration=6.109064848 podStartE2EDuration="6.109064848s" podCreationTimestamp="2026-04-16 15:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:09:31.107643019 +0000 UTC m=+998.761025393" watchObservedRunningTime="2026-04-16 15:09:31.109064848 +0000 UTC m=+998.762447226" Apr 16 15:09:35.567203 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:35.567167 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:35.567654 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:35.567212 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:35.579722 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:35.579701 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:36.118772 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:36.118746 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:58.728016 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:58.727982 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj"] Apr 16 15:09:58.728529 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:58.728378 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" podUID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerName="main" containerID="cri-o://c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88" gracePeriod=30 Apr 16 15:09:58.976944 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:58.976922 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:59.033541 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.033480 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-home\") pod \"effc2d54-635f-4412-8f06-cbcf357dcee7\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " Apr 16 15:09:59.033541 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.033530 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxc7l\" (UniqueName: \"kubernetes.io/projected/effc2d54-635f-4412-8f06-cbcf357dcee7-kube-api-access-hxc7l\") pod \"effc2d54-635f-4412-8f06-cbcf357dcee7\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " Apr 16 15:09:59.033727 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.033557 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-kserve-provision-location\") pod \"effc2d54-635f-4412-8f06-cbcf357dcee7\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " Apr 16 15:09:59.033727 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.033697 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/effc2d54-635f-4412-8f06-cbcf357dcee7-tls-certs\") pod \"effc2d54-635f-4412-8f06-cbcf357dcee7\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " Apr 16 15:09:59.033890 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.033749 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-model-cache\") pod \"effc2d54-635f-4412-8f06-cbcf357dcee7\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " Apr 16 15:09:59.033890 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.033780 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-dshm\") pod \"effc2d54-635f-4412-8f06-cbcf357dcee7\" (UID: \"effc2d54-635f-4412-8f06-cbcf357dcee7\") " Apr 16 15:09:59.033890 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.033787 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-home" (OuterVolumeSpecName: "home") pod "effc2d54-635f-4412-8f06-cbcf357dcee7" (UID: "effc2d54-635f-4412-8f06-cbcf357dcee7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.034063 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.034038 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-model-cache" (OuterVolumeSpecName: "model-cache") pod "effc2d54-635f-4412-8f06-cbcf357dcee7" (UID: "effc2d54-635f-4412-8f06-cbcf357dcee7"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.034145 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.034121 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-model-cache\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:59.034268 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.034150 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-home\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:59.035785 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.035758 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effc2d54-635f-4412-8f06-cbcf357dcee7-kube-api-access-hxc7l" (OuterVolumeSpecName: "kube-api-access-hxc7l") pod "effc2d54-635f-4412-8f06-cbcf357dcee7" (UID: "effc2d54-635f-4412-8f06-cbcf357dcee7"). InnerVolumeSpecName "kube-api-access-hxc7l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:09:59.035926 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.035904 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-dshm" (OuterVolumeSpecName: "dshm") pod "effc2d54-635f-4412-8f06-cbcf357dcee7" (UID: "effc2d54-635f-4412-8f06-cbcf357dcee7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.035994 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.035979 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effc2d54-635f-4412-8f06-cbcf357dcee7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "effc2d54-635f-4412-8f06-cbcf357dcee7" (UID: "effc2d54-635f-4412-8f06-cbcf357dcee7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:09:59.095791 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.095763 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "effc2d54-635f-4412-8f06-cbcf357dcee7" (UID: "effc2d54-635f-4412-8f06-cbcf357dcee7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:09:59.135330 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.135311 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/effc2d54-635f-4412-8f06-cbcf357dcee7-tls-certs\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:59.135330 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.135331 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-dshm\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:59.135466 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.135341 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxc7l\" (UniqueName: \"kubernetes.io/projected/effc2d54-635f-4412-8f06-cbcf357dcee7-kube-api-access-hxc7l\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:59.135466 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.135351 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/effc2d54-635f-4412-8f06-cbcf357dcee7-kserve-provision-location\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:09:59.184429 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.184402 2580 generic.go:358] "Generic (PLEG): container finished" podID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerID="c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88" exitCode=0 Apr 16 15:09:59.184546 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.184447 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" event={"ID":"effc2d54-635f-4412-8f06-cbcf357dcee7","Type":"ContainerDied","Data":"c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88"} Apr 16 15:09:59.184546 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.184476 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" Apr 16 15:09:59.184546 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.184495 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj" event={"ID":"effc2d54-635f-4412-8f06-cbcf357dcee7","Type":"ContainerDied","Data":"6de04f4406bc1761f7247aa987be821b7e89f35ecb9a795e37701d9ddf265096"} Apr 16 15:09:59.184546 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.184519 2580 scope.go:117] "RemoveContainer" containerID="c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88" Apr 16 15:09:59.193433 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.193413 2580 scope.go:117] "RemoveContainer" containerID="509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28" Apr 16 15:09:59.202414 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.202395 2580 scope.go:117] "RemoveContainer" containerID="c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88" Apr 16 15:09:59.202884 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:09:59.202832 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88\": container with ID starting with c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88 not found: ID does not exist" containerID="c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88" Apr 16 15:09:59.203001 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.202895 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88"} err="failed to get container status \"c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88\": rpc error: code = NotFound desc = could not find container \"c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88\": container with ID starting with c01ca8a5ac67041ea2aa368ebcb98d0ed07c823e27d8ed934d589e807f67de88 not found: ID does not exist" Apr 16 15:09:59.203001 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.202958 2580 scope.go:117] "RemoveContainer" containerID="509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28" Apr 16 15:09:59.203274 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:09:59.203230 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28\": container with ID starting with 509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28 not found: ID does not exist" containerID="509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28" Apr 16 15:09:59.203369 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.203268 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28"} err="failed to get container status \"509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28\": rpc error: code = NotFound desc = could not find container \"509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28\": container with ID starting with 509226f8cb94f6adbcc288b9089dc16337664f54f4e1e594937fcf328034dc28 not found: ID does not exist" Apr 16 15:09:59.204763 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.204746 2580 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj"] Apr 16 15:09:59.209002 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:09:59.208982 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-765dc7c466-t9tvj"] Apr 16 15:10:00.934998 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:00.934968 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effc2d54-635f-4412-8f06-cbcf357dcee7" path="/var/lib/kubelet/pods/effc2d54-635f-4412-8f06-cbcf357dcee7/volumes" Apr 16 15:10:12.256129 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.256090 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj"] Apr 16 15:10:12.256574 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.256400 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerName="storage-initializer" Apr 16 15:10:12.256574 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.256411 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerName="storage-initializer" Apr 16 15:10:12.256574 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.256428 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerName="main" Apr 16 15:10:12.256574 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.256434 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerName="main" Apr 16 15:10:12.256574 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.256489 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="effc2d54-635f-4412-8f06-cbcf357dcee7" containerName="main" Apr 16 15:10:12.259583 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.259563 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.263386 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.263179 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:10:12.263386 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.263230 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 15:10:12.263386 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.263265 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 15:10:12.263386 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.263191 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\"" Apr 16 15:10:12.270506 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.270483 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj"] Apr 16 15:10:12.337351 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.337320 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.337509 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.337358 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-home\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.337509 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.337377 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-tls-certs\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.337509 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.337401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-dshm\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.337509 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.337450 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9mjb\" (UniqueName: \"kubernetes.io/projected/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kube-api-access-h9mjb\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 
16 15:10:12.337509 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.337485 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-model-cache\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438268 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438234 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-dshm\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438416 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438277 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9mjb\" (UniqueName: \"kubernetes.io/projected/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kube-api-access-h9mjb\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438416 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438308 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-model-cache\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438416 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438360 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438416 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438388 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-home\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438416 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438410 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-tls-certs\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438773 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438748 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-model-cache\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438773 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438765 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.438975 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.438800 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-home\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.440652 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.440625 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-dshm\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.440916 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.440897 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-tls-certs\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.447422 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.447397 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9mjb\" (UniqueName: \"kubernetes.io/projected/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kube-api-access-h9mjb\") pod \"precise-prefix-cache-test-kserve-57947565d-dm9hj\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.574141 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.574070 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:12.692070 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:12.692044 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj"] Apr 16 15:10:12.694737 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:10:12.694713 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75dc1f48_0af5_4db4_8a2b_18d2f5bc14d2.slice/crio-cd1d2990b9001d41aaa48186ea05a6306528e0bcca25cc6886cc79815f21b0d1 WatchSource:0}: Error finding container cd1d2990b9001d41aaa48186ea05a6306528e0bcca25cc6886cc79815f21b0d1: Status 404 returned error can't find the container with id cd1d2990b9001d41aaa48186ea05a6306528e0bcca25cc6886cc79815f21b0d1 Apr 16 15:10:13.231443 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:13.231402 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" event={"ID":"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2","Type":"ContainerStarted","Data":"fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820"} Apr 16 15:10:13.231443 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:13.231448 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" event={"ID":"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2","Type":"ContainerStarted","Data":"cd1d2990b9001d41aaa48186ea05a6306528e0bcca25cc6886cc79815f21b0d1"} Apr 16 15:10:17.246387 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:17.246356 2580 generic.go:358] "Generic (PLEG): container finished" podID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerID="fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820" exitCode=0 Apr 16 15:10:17.246877 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:17.246431 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" event={"ID":"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2","Type":"ContainerDied","Data":"fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820"} Apr 16 15:10:18.251410 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:18.251377 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" event={"ID":"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2","Type":"ContainerStarted","Data":"efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831"} Apr 16 15:10:18.269119 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:18.269043 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" podStartSLOduration=6.26902902 podStartE2EDuration="6.26902902s" podCreationTimestamp="2026-04-16 15:10:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:10:18.268462827 +0000 UTC m=+1045.921845203" watchObservedRunningTime="2026-04-16 15:10:18.26902902 +0000 UTC m=+1045.922411464" Apr 16 15:10:22.574820 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:22.574773 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:22.575205 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:22.574927 2580 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:22.587386 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:22.587366 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:23.278138 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:23.278114 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:57.886673 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:57.886641 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj"] Apr 16 15:10:57.887190 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:57.886974 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" podUID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerName="main" containerID="cri-o://efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831" gracePeriod=30 Apr 16 15:10:58.138941 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.138887 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:58.210477 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.210448 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9mjb\" (UniqueName: \"kubernetes.io/projected/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kube-api-access-h9mjb\") pod \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " Apr 16 15:10:58.210623 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.210482 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-tls-certs\") pod \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " Apr 16 15:10:58.210623 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.210503 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kserve-provision-location\") pod \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " Apr 16 15:10:58.210745 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.210631 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-dshm\") pod \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " Apr 16 15:10:58.210745 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.210667 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-model-cache\") pod \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " Apr 16 15:10:58.210745 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.210715 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-home\") pod 
\"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\" (UID: \"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2\") " Apr 16 15:10:58.211050 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.211019 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-model-cache" (OuterVolumeSpecName: "model-cache") pod "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" (UID: "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:58.211134 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.211064 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-home" (OuterVolumeSpecName: "home") pod "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" (UID: "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:58.212925 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.212895 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kube-api-access-h9mjb" (OuterVolumeSpecName: "kube-api-access-h9mjb") pod "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" (UID: "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2"). InnerVolumeSpecName "kube-api-access-h9mjb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:10:58.212925 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.212909 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" (UID: "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:10:58.213054 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.212922 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-dshm" (OuterVolumeSpecName: "dshm") pod "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" (UID: "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:58.268999 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.268965 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" (UID: "75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:10:58.311478 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.311457 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9mjb\" (UniqueName: \"kubernetes.io/projected/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kube-api-access-h9mjb\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:10:58.311478 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.311478 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-tls-certs\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:10:58.311613 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.311488 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-kserve-provision-location\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:10:58.311613 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.311497 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-dshm\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:10:58.311613 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.311525 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-model-cache\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:10:58.311613 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.311534 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2-home\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:10:58.384717 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.384687 2580 generic.go:358] "Generic (PLEG): container finished" podID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerID="efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831" exitCode=0 Apr 16 15:10:58.384859 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.384767 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" event={"ID":"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2","Type":"ContainerDied","Data":"efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831"} Apr 16 15:10:58.384859 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.384787 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" Apr 16 15:10:58.384944 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.384861 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj" event={"ID":"75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2","Type":"ContainerDied","Data":"cd1d2990b9001d41aaa48186ea05a6306528e0bcca25cc6886cc79815f21b0d1"} Apr 16 15:10:58.384944 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.384885 2580 scope.go:117] "RemoveContainer" containerID="efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831" Apr 16 15:10:58.395074 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.395053 2580 scope.go:117] "RemoveContainer" containerID="fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820" Apr 16 15:10:58.407490 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.407469 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj"] Apr 16 15:10:58.410450 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.410411 2580 scope.go:117] "RemoveContainer" containerID="efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831" Apr 16 15:10:58.410680 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:10:58.410659 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831\": container with ID starting with efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831 not found: ID does not exist" containerID="efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831" Apr 16 15:10:58.410739 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.410688 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831"} err="failed to get container status \"efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831\": rpc error: code = NotFound desc = could not find container \"efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831\": container with ID starting with efda0d019952585b13ea9fb1c11658c3b605df67eff4a80f60cf2efc2d3b9831 not found: ID does not exist" Apr 16 15:10:58.410739 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.410709 2580 scope.go:117] "RemoveContainer" containerID="fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820" Apr 16 15:10:58.411069 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:10:58.411044 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820\": container with ID starting with fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820 not found: ID does not exist" containerID="fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820" Apr 16 15:10:58.411155 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.411078 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820"} err="failed to get container status \"fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820\": rpc error: code = NotFound desc = could not find container \"fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820\": container with ID starting 
with fd2cc0335717e9c5cca5a315f3f63ac3e8a22311aaaa4eb8343cd5bd1a373820 not found: ID does not exist" Apr 16 15:10:58.411648 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.411629 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-57947565d-dm9hj"] Apr 16 15:10:58.935576 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:10:58.935549 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" path="/var/lib/kubelet/pods/75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2/volumes" Apr 16 15:12:52.876515 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:12:52.876484 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:12:52.878810 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:12:52.878788 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:12:52.883334 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:12:52.883301 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:12:52.885077 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:12:52.885062 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:15:08.965545 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.965473 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5c55c5c99-b46jq"] Apr 16 15:15:08.966005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.965782 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerName="storage-initializer" Apr 16 15:15:08.966005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.965793 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerName="storage-initializer" Apr 16 15:15:08.966005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.965801 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerName="main" Apr 16 15:15:08.966005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.965807 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerName="main" Apr 16 15:15:08.966005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.965878 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="75dc1f48-0af5-4db4-8a2b-18d2f5bc14d2" containerName="main" Apr 16 15:15:08.968653 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.968638 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:08.971308 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.971288 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 15:15:08.971975 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.971950 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-9txrd\"" Apr 16 15:15:08.971975 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.971970 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 15:15:08.972116 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.972097 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 15:15:08.976996 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:08.976976 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5c55c5c99-b46jq"] Apr 16 15:15:09.052462 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.052439 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-972c9\" (UniqueName: \"kubernetes.io/projected/9488bf4c-5b0d-46e2-ba34-04456010e7af-kube-api-access-972c9\") pod \"llmisvc-controller-manager-5c55c5c99-b46jq\" (UID: \"9488bf4c-5b0d-46e2-ba34-04456010e7af\") " pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:09.052582 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.052531 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9488bf4c-5b0d-46e2-ba34-04456010e7af-cert\") pod \"llmisvc-controller-manager-5c55c5c99-b46jq\" (UID: \"9488bf4c-5b0d-46e2-ba34-04456010e7af\") " pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:09.152972 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.152943 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9488bf4c-5b0d-46e2-ba34-04456010e7af-cert\") pod \"llmisvc-controller-manager-5c55c5c99-b46jq\" (UID: \"9488bf4c-5b0d-46e2-ba34-04456010e7af\") " pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:09.153102 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.152995 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-972c9\" (UniqueName: \"kubernetes.io/projected/9488bf4c-5b0d-46e2-ba34-04456010e7af-kube-api-access-972c9\") pod \"llmisvc-controller-manager-5c55c5c99-b46jq\" (UID: \"9488bf4c-5b0d-46e2-ba34-04456010e7af\") " pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:09.155164 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.155142 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9488bf4c-5b0d-46e2-ba34-04456010e7af-cert\") pod \"llmisvc-controller-manager-5c55c5c99-b46jq\" (UID: \"9488bf4c-5b0d-46e2-ba34-04456010e7af\") " pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:09.160529 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.160509 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-972c9\" (UniqueName: \"kubernetes.io/projected/9488bf4c-5b0d-46e2-ba34-04456010e7af-kube-api-access-972c9\") pod 
\"llmisvc-controller-manager-5c55c5c99-b46jq\" (UID: \"9488bf4c-5b0d-46e2-ba34-04456010e7af\") " pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:09.278809 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.278734 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:09.397748 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.397718 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5c55c5c99-b46jq"] Apr 16 15:15:09.401535 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:15:09.401501 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9488bf4c_5b0d_46e2_ba34_04456010e7af.slice/crio-221599482bde6ce8c42328936d0acf5c465124799fb0814352c0c5e9ef8966b2 WatchSource:0}: Error finding container 221599482bde6ce8c42328936d0acf5c465124799fb0814352c0c5e9ef8966b2: Status 404 returned error can't find the container with id 221599482bde6ce8c42328936d0acf5c465124799fb0814352c0c5e9ef8966b2 Apr 16 15:15:09.402994 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:09.402977 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:15:10.200892 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:10.200863 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" event={"ID":"9488bf4c-5b0d-46e2-ba34-04456010e7af","Type":"ContainerStarted","Data":"221599482bde6ce8c42328936d0acf5c465124799fb0814352c0c5e9ef8966b2"} Apr 16 15:15:13.212044 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:13.211974 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" event={"ID":"9488bf4c-5b0d-46e2-ba34-04456010e7af","Type":"ContainerStarted","Data":"f0461aa73a9e625a6c72877e000ab80358b25dfcb634dd56ec5b731ca4f64096"} Apr 16 15:15:13.212385 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:13.212075 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:15:13.228145 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:13.228103 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" podStartSLOduration=1.776108665 podStartE2EDuration="5.228088195s" podCreationTimestamp="2026-04-16 15:15:08 +0000 UTC" firstStartedPulling="2026-04-16 15:15:09.403097412 +0000 UTC m=+1337.056479766" lastFinishedPulling="2026-04-16 15:15:12.855076935 +0000 UTC m=+1340.508459296" observedRunningTime="2026-04-16 15:15:13.226962918 +0000 UTC m=+1340.880345315" watchObservedRunningTime="2026-04-16 15:15:13.228088195 +0000 UTC m=+1340.881470569" Apr 16 15:15:44.217879 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:15:44.217826 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5c55c5c99-b46jq" Apr 16 15:17:52.899889 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:17:52.899864 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:17:52.901974 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:17:52.901950 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:17:52.906296 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:17:52.906279 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:17:52.908517 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:17:52.908500 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:20:27.159455 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.159420 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 15:20:27.161806 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.161785 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.164126 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.164102 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:20:27.164255 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.164233 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 15:20:27.164255 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.164251 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\"" Apr 16 15:20:27.165019 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.165005 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 15:20:27.165112 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.165052 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-75nnz\"" Apr 16 15:20:27.174497 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.174475 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 15:20:27.273276 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.273244 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.273412 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.273292 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.273412 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.273314 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc725c2-f6f6-45c9-bb01-e19731918825-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.273412 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.273361 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7grss\" (UniqueName: \"kubernetes.io/projected/8bc725c2-f6f6-45c9-bb01-e19731918825-kube-api-access-7grss\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.273550 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.273420 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.273550 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.273445 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.373992 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.373964 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.374152 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.374015 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.374152 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.374051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.374152 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.374105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-home\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.374152 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.374131 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc725c2-f6f6-45c9-bb01-e19731918825-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.374380 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.374172 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7grss\" (UniqueName: \"kubernetes.io/projected/8bc725c2-f6f6-45c9-bb01-e19731918825-kube-api-access-7grss\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.375665 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.375003 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.375665 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.375188 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.375665 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.375355 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.378750 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.378724 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.378950 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.378928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc725c2-f6f6-45c9-bb01-e19731918825-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.381186 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.381165 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7grss\" (UniqueName: \"kubernetes.io/projected/8bc725c2-f6f6-45c9-bb01-e19731918825-kube-api-access-7grss\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.473756 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.473687 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:20:27.595834 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.595810 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 15:20:27.598619 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:20:27.598591 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc725c2_f6f6_45c9_bb01_e19731918825.slice/crio-639bc4beb83ed9aa992b123a38fc44353170e9d5a440416c06953503c300cce8 WatchSource:0}: Error finding container 639bc4beb83ed9aa992b123a38fc44353170e9d5a440416c06953503c300cce8: Status 404 returned error can't find the container with id 639bc4beb83ed9aa992b123a38fc44353170e9d5a440416c06953503c300cce8 Apr 16 15:20:27.600574 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:27.600557 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 15:20:28.255178 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:28.255144 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bc725c2-f6f6-45c9-bb01-e19731918825","Type":"ContainerStarted","Data":"4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0"} Apr 16 15:20:28.255178 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:28.255185 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bc725c2-f6f6-45c9-bb01-e19731918825","Type":"ContainerStarted","Data":"639bc4beb83ed9aa992b123a38fc44353170e9d5a440416c06953503c300cce8"} Apr 16 15:20:32.271712 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:32.271631 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerID="4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0" exitCode=0 Apr 16 15:20:32.272053 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:20:32.271706 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bc725c2-f6f6-45c9-bb01-e19731918825","Type":"ContainerDied","Data":"4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0"} Apr 16 15:21:20.453468 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:21:20.453434 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bc725c2-f6f6-45c9-bb01-e19731918825","Type":"ContainerStarted","Data":"9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42"} Apr 16 15:21:20.470297 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:21:20.470231 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=5.6918321469999995 
podStartE2EDuration="53.470217052s" podCreationTimestamp="2026-04-16 15:20:27 +0000 UTC" firstStartedPulling="2026-04-16 15:20:32.2728727 +0000 UTC m=+1659.926255053" lastFinishedPulling="2026-04-16 15:21:20.051257599 +0000 UTC m=+1707.704639958" observedRunningTime="2026-04-16 15:21:20.469241719 +0000 UTC m=+1708.122624103" watchObservedRunningTime="2026-04-16 15:21:20.470217052 +0000 UTC m=+1708.123599470" Apr 16 15:22:52.923877 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:22:52.923831 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:22:52.926315 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:22:52.926295 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:22:52.930119 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:22:52.930101 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:22:52.933146 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:22:52.933127 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:23:00.385126 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:00.385089 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 15:23:00.385585 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:00.385353 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerName="main" containerID="cri-o://9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42" gracePeriod=30 Apr 16 15:23:01.222297 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.222267 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:23:01.328901 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.328800 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc725c2-f6f6-45c9-bb01-e19731918825-tls-certs\") pod \"8bc725c2-f6f6-45c9-bb01-e19731918825\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " Apr 16 15:23:01.328901 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.328866 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7grss\" (UniqueName: \"kubernetes.io/projected/8bc725c2-f6f6-45c9-bb01-e19731918825-kube-api-access-7grss\") pod \"8bc725c2-f6f6-45c9-bb01-e19731918825\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " Apr 16 15:23:01.328901 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.328887 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-home\") pod \"8bc725c2-f6f6-45c9-bb01-e19731918825\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " Apr 16 15:23:01.329527 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.328910 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-kserve-provision-location\") pod \"8bc725c2-f6f6-45c9-bb01-e19731918825\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " Apr 16 15:23:01.329527 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.328930 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-dshm\") pod \"8bc725c2-f6f6-45c9-bb01-e19731918825\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " Apr 16 15:23:01.329527 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.328959 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-model-cache\") pod \"8bc725c2-f6f6-45c9-bb01-e19731918825\" (UID: \"8bc725c2-f6f6-45c9-bb01-e19731918825\") " Apr 16 15:23:01.329527 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.329275 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-model-cache" (OuterVolumeSpecName: "model-cache") pod "8bc725c2-f6f6-45c9-bb01-e19731918825" (UID: "8bc725c2-f6f6-45c9-bb01-e19731918825"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:01.329527 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.329295 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-home" (OuterVolumeSpecName: "home") pod "8bc725c2-f6f6-45c9-bb01-e19731918825" (UID: "8bc725c2-f6f6-45c9-bb01-e19731918825"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:01.331113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.331082 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bc725c2-f6f6-45c9-bb01-e19731918825-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8bc725c2-f6f6-45c9-bb01-e19731918825" (UID: "8bc725c2-f6f6-45c9-bb01-e19731918825"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:23:01.331259 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.331240 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bc725c2-f6f6-45c9-bb01-e19731918825-kube-api-access-7grss" (OuterVolumeSpecName: "kube-api-access-7grss") pod "8bc725c2-f6f6-45c9-bb01-e19731918825" (UID: "8bc725c2-f6f6-45c9-bb01-e19731918825"). InnerVolumeSpecName "kube-api-access-7grss". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:23:01.331318 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.331277 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-dshm" (OuterVolumeSpecName: "dshm") pod "8bc725c2-f6f6-45c9-bb01-e19731918825" (UID: "8bc725c2-f6f6-45c9-bb01-e19731918825"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:01.383427 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.383398 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8bc725c2-f6f6-45c9-bb01-e19731918825" (UID: "8bc725c2-f6f6-45c9-bb01-e19731918825"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:23:01.429487 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.429468 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-kserve-provision-location\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:23:01.429487 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.429489 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-dshm\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:23:01.429811 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.429498 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-model-cache\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:23:01.429811 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.429508 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8bc725c2-f6f6-45c9-bb01-e19731918825-tls-certs\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:23:01.429811 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.429518 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7grss\" (UniqueName: \"kubernetes.io/projected/8bc725c2-f6f6-45c9-bb01-e19731918825-kube-api-access-7grss\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:23:01.429811 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.429527 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8bc725c2-f6f6-45c9-bb01-e19731918825-home\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:23:01.816967 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.816928 2580 generic.go:358] "Generic (PLEG): container finished" podID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerID="9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42" exitCode=0 Apr 16 15:23:01.817116 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.816993 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 15:23:01.817116 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.817010 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bc725c2-f6f6-45c9-bb01-e19731918825","Type":"ContainerDied","Data":"9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42"} Apr 16 15:23:01.817116 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.817050 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8bc725c2-f6f6-45c9-bb01-e19731918825","Type":"ContainerDied","Data":"639bc4beb83ed9aa992b123a38fc44353170e9d5a440416c06953503c300cce8"} Apr 16 15:23:01.817116 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.817065 2580 scope.go:117] "RemoveContainer" containerID="9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42" Apr 16 15:23:01.836148 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.836131 2580 scope.go:117] "RemoveContainer" containerID="4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0" Apr 16 15:23:01.838992 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.838972 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 15:23:01.842795 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.842770 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 15:23:01.846886 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.846873 2580 scope.go:117] "RemoveContainer" containerID="9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42" Apr 16 15:23:01.847109 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:23:01.847093 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42\": container with ID starting with 9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42 not found: ID does not exist" containerID="9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42" Apr 16 15:23:01.847158 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.847115 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42"} err="failed to get container status \"9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42\": rpc error: code = NotFound desc = could not find container \"9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42\": container with ID starting with 9983d9985497c92dd07ec8ee3e6afda1fa521c9791805d03d5f1b564c756fb42 not found: ID does not exist" Apr 16 15:23:01.847158 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.847130 2580 scope.go:117] "RemoveContainer" containerID="4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0" Apr 16 15:23:01.847378 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:23:01.847352 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0\": container with ID starting with 4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0 not found: ID does 
not exist" containerID="4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0" Apr 16 15:23:01.847432 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:01.847377 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0"} err="failed to get container status \"4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0\": rpc error: code = NotFound desc = could not find container \"4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0\": container with ID starting with 4665a4094fa0fcd3b847dfa91da3f09847e0e052d9eeb230e5af08384b7afea0 not found: ID does not exist" Apr 16 15:23:02.934693 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:02.934659 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bc725c2-f6f6-45c9-bb01-e19731918825" path="/var/lib/kubelet/pods/8bc725c2-f6f6-45c9-bb01-e19731918825/volumes" Apr 16 15:23:04.045922 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.045892 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5"] Apr 16 15:23:04.046270 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.046218 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerName="main" Apr 16 15:23:04.046270 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.046229 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerName="main" Apr 16 15:23:04.046270 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.046237 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerName="storage-initializer" Apr 16 15:23:04.046270 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.046244 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerName="storage-initializer" Apr 16 15:23:04.046392 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.046314 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bc725c2-f6f6-45c9-bb01-e19731918825" containerName="main" Apr 16 15:23:04.050962 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.050940 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.053593 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.053543 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 15:23:04.053735 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.053574 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-vxnb6\"" Apr 16 15:23:04.053735 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.053643 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 15:23:04.053735 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.053689 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 15:23:04.063607 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.063585 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5"] Apr 16 15:23:04.151657 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.151657 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.151904 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151675 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2745d706-4614-4168-b260-c4d530c239c2-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.151904 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151705 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.151904 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151760 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.151904 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151806 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.151904 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151823 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2745d706-4614-4168-b260-c4d530c239c2-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.151904 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151862 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2745d706-4614-4168-b260-c4d530c239c2-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.152148 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.151911 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmft4\" (UniqueName: \"kubernetes.io/projected/2745d706-4614-4168-b260-c4d530c239c2-kube-api-access-qmft4\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252535 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmft4\" (UniqueName: \"kubernetes.io/projected/2745d706-4614-4168-b260-c4d530c239c2-kube-api-access-qmft4\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252699 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252548 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252699 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252699 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252601 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2745d706-4614-4168-b260-c4d530c239c2-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252814 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252741 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252814 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252803 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252926 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252910 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252980 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252941 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2745d706-4614-4168-b260-c4d530c239c2-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.252980 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.252974 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2745d706-4614-4168-b260-c4d530c239c2-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.253104 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.253080 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.253141 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.253113 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " 
pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.253901 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.253873 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2745d706-4614-4168-b260-c4d530c239c2-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.254078 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.254060 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.254231 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.254206 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.255090 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.255068 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2745d706-4614-4168-b260-c4d530c239c2-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.255514 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.255496 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2745d706-4614-4168-b260-c4d530c239c2-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.260308 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.260284 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmft4\" (UniqueName: \"kubernetes.io/projected/2745d706-4614-4168-b260-c4d530c239c2-kube-api-access-qmft4\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.260390 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.260351 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2745d706-4614-4168-b260-c4d530c239c2-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-n92d5\" (UID: \"2745d706-4614-4168-b260-c4d530c239c2\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.361831 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.361799 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:04.489192 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.489167 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5"] Apr 16 15:23:04.491703 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:23:04.491673 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2745d706_4614_4168_b260_c4d530c239c2.slice/crio-4e144cc1bfaa13b5d8787965e08e669259e31d30f3aa2f7b99dbac81b41bb603 WatchSource:0}: Error finding container 4e144cc1bfaa13b5d8787965e08e669259e31d30f3aa2f7b99dbac81b41bb603: Status 404 returned error can't find the container with id 4e144cc1bfaa13b5d8787965e08e669259e31d30f3aa2f7b99dbac81b41bb603 Apr 16 15:23:04.827806 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:04.827720 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" event={"ID":"2745d706-4614-4168-b260-c4d530c239c2","Type":"ContainerStarted","Data":"4e144cc1bfaa13b5d8787965e08e669259e31d30f3aa2f7b99dbac81b41bb603"} Apr 16 15:23:06.824814 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:06.824777 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 15:23:06.825111 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:06.824867 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 15:23:06.825111 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:06.824898 2580 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 15:23:07.839159 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:07.839121 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" event={"ID":"2745d706-4614-4168-b260-c4d530c239c2","Type":"ContainerStarted","Data":"6a263c9745dc533d0836ffa5b4c2c66c649c5e4d3151105a7d9a021c8ed912a0"} Apr 16 15:23:07.860531 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:07.860483 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" podStartSLOduration=1.5293617579999998 podStartE2EDuration="3.860470005s" podCreationTimestamp="2026-04-16 15:23:04 +0000 UTC" firstStartedPulling="2026-04-16 15:23:04.493430387 +0000 UTC m=+1812.146812739" lastFinishedPulling="2026-04-16 15:23:06.824538635 +0000 UTC m=+1814.477920986" observedRunningTime="2026-04-16 15:23:07.859606156 +0000 UTC m=+1815.512988554" watchObservedRunningTime="2026-04-16 15:23:07.860470005 +0000 UTC m=+1815.513852378" Apr 16 15:23:08.362309 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:08.362274 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:08.363726 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:08.363676 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" 
podUID="2745d706-4614-4168-b260-c4d530c239c2" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.33:15021/healthz/ready\": dial tcp 10.134.0.33:15021: connect: connection refused" Apr 16 15:23:09.362555 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:09.362510 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" podUID="2745d706-4614-4168-b260-c4d530c239c2" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.134.0.33:15021/healthz/ready\": dial tcp 10.134.0.33:15021: connect: connection refused" Apr 16 15:23:10.366192 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:10.366116 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:10.854453 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:10.854429 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:10.855414 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:10.855396 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-n92d5" Apr 16 15:23:11.819360 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.819329 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4"] Apr 16 15:23:11.822734 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.822717 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:11.825084 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.825063 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 15:23:11.825795 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.825773 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dmzth\"" Apr 16 15:23:11.833453 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.833432 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4"] Apr 16 15:23:11.931030 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.931000 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-home\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:11.931175 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.931033 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a01539b4-0e53-4ab0-b751-4d76e2138fff-tls-certs\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:11.931175 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.931071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-model-cache\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:11.931175 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.931088 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:11.931175 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.931127 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-dshm\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:11.931175 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:11.931166 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fww2m\" (UniqueName: \"kubernetes.io/projected/a01539b4-0e53-4ab0-b751-4d76e2138fff-kube-api-access-fww2m\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.031995 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.031965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-model-cache\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032123 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.031999 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032123 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.032064 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-dshm\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032123 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.032091 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fww2m\" (UniqueName: \"kubernetes.io/projected/a01539b4-0e53-4ab0-b751-4d76e2138fff-kube-api-access-fww2m\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: 
\"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032288 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.032174 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-home\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032288 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.032203 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a01539b4-0e53-4ab0-b751-4d76e2138fff-tls-certs\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032411 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.032391 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-model-cache\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032471 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.032452 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.032585 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.032565 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-home\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.034400 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.034382 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-dshm\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.034655 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.034637 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a01539b4-0e53-4ab0-b751-4d76e2138fff-tls-certs\") pod \"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.039715 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.039692 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fww2m\" (UniqueName: \"kubernetes.io/projected/a01539b4-0e53-4ab0-b751-4d76e2138fff-kube-api-access-fww2m\") pod 
\"scheduler-inline-config-test-kserve-7d994b55cd-f6rm4\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.133827 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.133797 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:12.259366 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.259341 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4"] Apr 16 15:23:12.261599 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:23:12.261572 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda01539b4_0e53_4ab0_b751_4d76e2138fff.slice/crio-06cf864a8636b76b5e45d8a09acc6bf3b1e117f48297d84e159154fc088a8227 WatchSource:0}: Error finding container 06cf864a8636b76b5e45d8a09acc6bf3b1e117f48297d84e159154fc088a8227: Status 404 returned error can't find the container with id 06cf864a8636b76b5e45d8a09acc6bf3b1e117f48297d84e159154fc088a8227 Apr 16 15:23:12.862815 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.862780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" event={"ID":"a01539b4-0e53-4ab0-b751-4d76e2138fff","Type":"ContainerStarted","Data":"0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6"} Apr 16 15:23:12.862815 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:12.862818 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" event={"ID":"a01539b4-0e53-4ab0-b751-4d76e2138fff","Type":"ContainerStarted","Data":"06cf864a8636b76b5e45d8a09acc6bf3b1e117f48297d84e159154fc088a8227"} Apr 16 15:23:16.877199 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:16.877163 2580 generic.go:358] "Generic (PLEG): container finished" podID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerID="0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6" exitCode=0 Apr 16 15:23:16.877558 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:16.877215 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" event={"ID":"a01539b4-0e53-4ab0-b751-4d76e2138fff","Type":"ContainerDied","Data":"0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6"} Apr 16 15:23:17.882060 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:17.882027 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" event={"ID":"a01539b4-0e53-4ab0-b751-4d76e2138fff","Type":"ContainerStarted","Data":"3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034"} Apr 16 15:23:17.902897 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:17.902825 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" podStartSLOduration=6.9028117810000005 podStartE2EDuration="6.902811781s" podCreationTimestamp="2026-04-16 15:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:23:17.901232938 +0000 UTC m=+1825.554615330" watchObservedRunningTime="2026-04-16 15:23:17.902811781 +0000 UTC 
m=+1825.556194155" Apr 16 15:23:22.134616 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:22.134587 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:22.135029 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:22.134626 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:22.147504 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:22.147484 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:22.910701 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:22.910667 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:23:38.452148 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.452116 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg"] Apr 16 15:23:38.458185 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.458165 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.460379 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.460361 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-f7xzd\"" Apr 16 15:23:38.460456 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.460390 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 15:23:38.466491 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.466468 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg"] Apr 16 15:23:38.560432 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.560398 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjldv\" (UniqueName: \"kubernetes.io/projected/8d20d917-d834-4255-be95-1c2f55a8017a-kube-api-access-pjldv\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.560576 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.560456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-dshm\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.560576 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.560483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-model-cache\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.560576 ip-10-0-137-160 
kubenswrapper[2580]: I0416 15:23:38.560525 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d20d917-d834-4255-be95-1c2f55a8017a-tls-certs\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.560576 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.560565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-home\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.560708 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.560599 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.661614 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.661574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-dshm\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.661614 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.661616 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-model-cache\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.661884 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.661646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d20d917-d834-4255-be95-1c2f55a8017a-tls-certs\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.661884 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.661683 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-home\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.661884 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.661726 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.661884 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.661819 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjldv\" (UniqueName: \"kubernetes.io/projected/8d20d917-d834-4255-be95-1c2f55a8017a-kube-api-access-pjldv\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.662158 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.662134 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-model-cache\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.662211 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.662147 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-home\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.662264 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.662229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.664118 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.664097 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-dshm\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.664395 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.664378 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d20d917-d834-4255-be95-1c2f55a8017a-tls-certs\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.669143 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.669126 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjldv\" (UniqueName: \"kubernetes.io/projected/8d20d917-d834-4255-be95-1c2f55a8017a-kube-api-access-pjldv\") pod \"router-with-refs-pd-test-kserve-86d9dd65cc-h96dg\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.768604 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.768533 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:38.898392 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.898367 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg"] Apr 16 15:23:38.900625 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:23:38.900596 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d20d917_d834_4255_be95_1c2f55a8017a.slice/crio-c22218c5556059c7791549261b75079ff4d672e94c1fc48c2c952f132745d039 WatchSource:0}: Error finding container c22218c5556059c7791549261b75079ff4d672e94c1fc48c2c952f132745d039: Status 404 returned error can't find the container with id c22218c5556059c7791549261b75079ff4d672e94c1fc48c2c952f132745d039 Apr 16 15:23:38.952176 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:38.952150 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerStarted","Data":"c22218c5556059c7791549261b75079ff4d672e94c1fc48c2c952f132745d039"} Apr 16 15:23:39.956905 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:39.956879 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerStarted","Data":"550399b15cc16945adc38d13de6a66d8e792ec359e792136a2f40bde2efbee8b"} Apr 16 15:23:39.957195 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:39.957006 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:40.963092 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:40.963055 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerStarted","Data":"53192086e763ee094b78e954682e0534df100aa2f22160c3ad5386e238eb737c"} Apr 16 15:23:43.974952 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:43.974921 2580 generic.go:358] "Generic (PLEG): container finished" podID="8d20d917-d834-4255-be95-1c2f55a8017a" containerID="53192086e763ee094b78e954682e0534df100aa2f22160c3ad5386e238eb737c" exitCode=0 Apr 16 15:23:43.975319 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:43.974961 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerDied","Data":"53192086e763ee094b78e954682e0534df100aa2f22160c3ad5386e238eb737c"} Apr 16 15:23:44.980297 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:44.980264 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerStarted","Data":"460d21608b0a2ba41ad6ad48be0e41ece1badff4ec077883d7875a95a509860e"} Apr 16 15:23:45.003579 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:45.003517 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podStartSLOduration=6.007774829 podStartE2EDuration="7.003499966s" podCreationTimestamp="2026-04-16 15:23:38 +0000 UTC" 
firstStartedPulling="2026-04-16 15:23:38.902721546 +0000 UTC m=+1846.556103898" lastFinishedPulling="2026-04-16 15:23:39.898446679 +0000 UTC m=+1847.551829035" observedRunningTime="2026-04-16 15:23:45.002520761 +0000 UTC m=+1852.655903160" watchObservedRunningTime="2026-04-16 15:23:45.003499966 +0000 UTC m=+1852.656882340" Apr 16 15:23:48.769550 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:48.769511 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:48.769550 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:48.769547 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:23:48.770907 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:48.770876 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:23:58.769715 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:58.769660 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:23:58.781716 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:23:58.781685 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:24:08.768977 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:08.768932 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:24:12.725336 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:12.725301 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4"] Apr 16 15:24:12.725875 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:12.725656 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerName="main" containerID="cri-o://3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034" gracePeriod=30 Apr 16 15:24:12.978259 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:12.978201 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:24:13.077075 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077041 2580 generic.go:358] "Generic (PLEG): container finished" podID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerID="3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034" exitCode=0 Apr 16 15:24:13.077251 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077140 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" event={"ID":"a01539b4-0e53-4ab0-b751-4d76e2138fff","Type":"ContainerDied","Data":"3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034"} Apr 16 15:24:13.077251 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077192 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" event={"ID":"a01539b4-0e53-4ab0-b751-4d76e2138fff","Type":"ContainerDied","Data":"06cf864a8636b76b5e45d8a09acc6bf3b1e117f48297d84e159154fc088a8227"} Apr 16 15:24:13.077251 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077217 2580 scope.go:117] "RemoveContainer" containerID="3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034" Apr 16 15:24:13.077251 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077149 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" Apr 16 15:24:13.077517 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077490 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-home\") pod \"a01539b4-0e53-4ab0-b751-4d76e2138fff\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " Apr 16 15:24:13.077645 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077522 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-kserve-provision-location\") pod \"a01539b4-0e53-4ab0-b751-4d76e2138fff\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " Apr 16 15:24:13.077645 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077542 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fww2m\" (UniqueName: \"kubernetes.io/projected/a01539b4-0e53-4ab0-b751-4d76e2138fff-kube-api-access-fww2m\") pod \"a01539b4-0e53-4ab0-b751-4d76e2138fff\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " Apr 16 15:24:13.077645 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077576 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-dshm\") pod \"a01539b4-0e53-4ab0-b751-4d76e2138fff\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " Apr 16 15:24:13.077809 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077644 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a01539b4-0e53-4ab0-b751-4d76e2138fff-tls-certs\") pod \"a01539b4-0e53-4ab0-b751-4d76e2138fff\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " Apr 16 15:24:13.077809 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077752 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-model-cache\") pod \"a01539b4-0e53-4ab0-b751-4d76e2138fff\" (UID: \"a01539b4-0e53-4ab0-b751-4d76e2138fff\") " Apr 16 15:24:13.077809 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.077788 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-home" (OuterVolumeSpecName: "home") pod "a01539b4-0e53-4ab0-b751-4d76e2138fff" (UID: "a01539b4-0e53-4ab0-b751-4d76e2138fff"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:24:13.078305 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.078064 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-model-cache" (OuterVolumeSpecName: "model-cache") pod "a01539b4-0e53-4ab0-b751-4d76e2138fff" (UID: "a01539b4-0e53-4ab0-b751-4d76e2138fff"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:24:13.078305 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.078108 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-home\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:24:13.080311 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.080286 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a01539b4-0e53-4ab0-b751-4d76e2138fff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a01539b4-0e53-4ab0-b751-4d76e2138fff" (UID: "a01539b4-0e53-4ab0-b751-4d76e2138fff"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:24:13.080427 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.080387 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-dshm" (OuterVolumeSpecName: "dshm") pod "a01539b4-0e53-4ab0-b751-4d76e2138fff" (UID: "a01539b4-0e53-4ab0-b751-4d76e2138fff"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:24:13.081201 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.081166 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01539b4-0e53-4ab0-b751-4d76e2138fff-kube-api-access-fww2m" (OuterVolumeSpecName: "kube-api-access-fww2m") pod "a01539b4-0e53-4ab0-b751-4d76e2138fff" (UID: "a01539b4-0e53-4ab0-b751-4d76e2138fff"). InnerVolumeSpecName "kube-api-access-fww2m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:24:13.097953 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.097917 2580 scope.go:117] "RemoveContainer" containerID="0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6" Apr 16 15:24:13.149760 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.149708 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a01539b4-0e53-4ab0-b751-4d76e2138fff" (UID: "a01539b4-0e53-4ab0-b751-4d76e2138fff"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:24:13.163006 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.162984 2580 scope.go:117] "RemoveContainer" containerID="3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034" Apr 16 15:24:13.163370 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:24:13.163345 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034\": container with ID starting with 3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034 not found: ID does not exist" containerID="3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034" Apr 16 15:24:13.163486 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.163377 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034"} err="failed to get container status \"3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034\": rpc error: code = NotFound desc = could not find container \"3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034\": container with ID starting with 3e9db5db1b007d5a65c5fc394754c1d6a63058d35a71e885271e8e696de00034 not found: ID does not exist" Apr 16 15:24:13.163486 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.163398 2580 scope.go:117] "RemoveContainer" containerID="0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6" Apr 16 15:24:13.163663 ip-10-0-137-160 kubenswrapper[2580]: E0416 15:24:13.163646 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6\": container with ID starting with 0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6 not found: ID does not exist" containerID="0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6" Apr 16 15:24:13.163731 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.163670 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6"} err="failed to get container status \"0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6\": rpc error: code = NotFound desc = could not find container \"0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6\": container with ID starting with 0ab8b4b0684e169af454f1b05c014b55b948a83aad90ba4c3cb76c1a0a6f7bb6 not found: ID does not exist" Apr 16 15:24:13.178599 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.178577 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a01539b4-0e53-4ab0-b751-4d76e2138fff-tls-certs\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:24:13.178719 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.178604 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-model-cache\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:24:13.178719 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.178616 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-kserve-provision-location\") on node 
\"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:24:13.178719 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.178627 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fww2m\" (UniqueName: \"kubernetes.io/projected/a01539b4-0e53-4ab0-b751-4d76e2138fff-kube-api-access-fww2m\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:24:13.178719 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.178636 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a01539b4-0e53-4ab0-b751-4d76e2138fff-dshm\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:24:13.401319 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.401292 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4"] Apr 16 15:24:13.404493 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:13.404470 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4"] Apr 16 15:24:14.935865 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:14.935807 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" path="/var/lib/kubelet/pods/a01539b4-0e53-4ab0-b751-4d76e2138fff/volumes" Apr 16 15:24:17.900473 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:17.900423 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7d994b55cd-f6rm4" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": context deadline exceeded" Apr 16 15:24:18.769987 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:18.769929 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:24:28.769903 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:28.769825 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:24:38.769666 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:38.769619 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:24:48.769194 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:48.769142 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:24:58.769141 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:24:58.769093 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:25:08.769219 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:08.769160 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8001/health\": dial tcp 10.134.0.35:8001: connect: connection refused" Apr 16 15:25:18.779882 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:18.779832 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:25:18.796690 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:18.796667 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:25:30.223058 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:30.223027 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg"] Apr 16 15:25:30.223586 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:30.223384 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" containerID="cri-o://460d21608b0a2ba41ad6ad48be0e41ece1badff4ec077883d7875a95a509860e" gracePeriod=30 Apr 16 15:25:45.470800 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:45.470769 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:45.495736 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:45.495715 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:45.501681 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:45.501659 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:45.511445 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:45.511408 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:46.482927 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:46.482900 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:46.504330 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:46.504308 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:46.510867 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:46.510834 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:46.520103 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:46.520086 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:47.470423 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:47.470393 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:47.489945 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:47.489919 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:47.495635 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:47.495605 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:47.504834 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:47.504815 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:48.452944 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:48.452906 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:48.474266 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:48.474245 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:48.480049 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:48.480031 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:48.488714 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:48.488695 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:49.411797 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:49.411764 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:49.434464 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:49.434437 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:49.444557 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:49.444538 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:49.457739 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:49.457724 2580 log.go:25] "Finished parsing 
log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:50.385605 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:50.385578 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:50.405568 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:50.405540 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:50.411869 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:50.411832 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:50.421751 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:50.421728 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:51.427513 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:51.427488 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:51.447691 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:51.447662 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:51.454317 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:51.454299 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:51.465275 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:51.465257 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:52.414800 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:52.414771 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:52.437598 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:52.437570 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:52.443342 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:52.443324 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:52.454460 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:52.454439 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:53.383125 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:53.383100 2580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:53.406466 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:53.406441 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:53.413405 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:53.413390 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:53.428160 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:53.428143 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:54.372571 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:54.372538 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:54.393100 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:54.393066 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:54.399170 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:54.399149 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:54.410567 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:54.410539 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:55.351456 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:55.351425 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:55.373440 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:55.373416 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:55.381639 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:55.381620 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:55.396075 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:55.396052 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:56.355417 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:56.355393 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:56.376986 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:56.376961 2580 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:56.384661 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:56.384640 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:56.395394 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:56.395377 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:57.421969 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:57.421941 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:57.445184 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:57.445160 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:57.453964 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:57.453940 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:57.465607 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:57.465586 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:25:58.459082 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:58.459057 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-n92d5_2745d706-4614-4168-b260-c4d530c239c2/istio-proxy/0.log" Apr 16 15:25:58.478534 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:58.478512 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:25:58.487757 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:58.487737 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/llm-d-routing-sidecar/0.log" Apr 16 15:25:58.498171 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:25:58.498151 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/storage-initializer/0.log" Apr 16 15:26:00.224197 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.224160 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="llm-d-routing-sidecar" containerID="cri-o://550399b15cc16945adc38d13de6a66d8e792ec359e792136a2f40bde2efbee8b" gracePeriod=2 Apr 16 15:26:00.456120 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.456100 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:26:00.456821 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.456795 2580 generic.go:358] "Generic (PLEG): container finished" podID="8d20d917-d834-4255-be95-1c2f55a8017a" containerID="460d21608b0a2ba41ad6ad48be0e41ece1badff4ec077883d7875a95a509860e" exitCode=137 Apr 16 15:26:00.456821 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.456821 2580 generic.go:358] "Generic (PLEG): container finished" podID="8d20d917-d834-4255-be95-1c2f55a8017a" containerID="550399b15cc16945adc38d13de6a66d8e792ec359e792136a2f40bde2efbee8b" exitCode=0 Apr 16 15:26:00.457001 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.456878 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerDied","Data":"460d21608b0a2ba41ad6ad48be0e41ece1badff4ec077883d7875a95a509860e"} Apr 16 15:26:00.457001 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.456914 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerDied","Data":"550399b15cc16945adc38d13de6a66d8e792ec359e792136a2f40bde2efbee8b"} Apr 16 15:26:00.457001 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.456931 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" event={"ID":"8d20d917-d834-4255-be95-1c2f55a8017a","Type":"ContainerDied","Data":"c22218c5556059c7791549261b75079ff4d672e94c1fc48c2c952f132745d039"} Apr 16 15:26:00.457001 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.456945 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c22218c5556059c7791549261b75079ff4d672e94c1fc48c2c952f132745d039" Apr 16 15:26:00.469558 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.469539 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-86d9dd65cc-h96dg_8d20d917-d834-4255-be95-1c2f55a8017a/main/0.log" Apr 16 15:26:00.470206 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.470193 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:26:00.622313 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622287 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-kserve-provision-location\") pod \"8d20d917-d834-4255-be95-1c2f55a8017a\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " Apr 16 15:26:00.622484 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622345 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-home\") pod \"8d20d917-d834-4255-be95-1c2f55a8017a\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " Apr 16 15:26:00.622484 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622378 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-dshm\") pod \"8d20d917-d834-4255-be95-1c2f55a8017a\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " Apr 16 15:26:00.622484 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622419 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjldv\" (UniqueName: \"kubernetes.io/projected/8d20d917-d834-4255-be95-1c2f55a8017a-kube-api-access-pjldv\") pod \"8d20d917-d834-4255-be95-1c2f55a8017a\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " Apr 16 15:26:00.622484 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622457 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-model-cache\") pod \"8d20d917-d834-4255-be95-1c2f55a8017a\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " Apr 16 15:26:00.622705 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622487 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d20d917-d834-4255-be95-1c2f55a8017a-tls-certs\") pod \"8d20d917-d834-4255-be95-1c2f55a8017a\" (UID: \"8d20d917-d834-4255-be95-1c2f55a8017a\") " Apr 16 15:26:00.622781 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622742 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-model-cache" (OuterVolumeSpecName: "model-cache") pod "8d20d917-d834-4255-be95-1c2f55a8017a" (UID: "8d20d917-d834-4255-be95-1c2f55a8017a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.622835 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.622781 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-home" (OuterVolumeSpecName: "home") pod "8d20d917-d834-4255-be95-1c2f55a8017a" (UID: "8d20d917-d834-4255-be95-1c2f55a8017a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.624758 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.624723 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-dshm" (OuterVolumeSpecName: "dshm") pod "8d20d917-d834-4255-be95-1c2f55a8017a" (UID: "8d20d917-d834-4255-be95-1c2f55a8017a"). 
InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.624901 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.624861 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d20d917-d834-4255-be95-1c2f55a8017a-kube-api-access-pjldv" (OuterVolumeSpecName: "kube-api-access-pjldv") pod "8d20d917-d834-4255-be95-1c2f55a8017a" (UID: "8d20d917-d834-4255-be95-1c2f55a8017a"). InnerVolumeSpecName "kube-api-access-pjldv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 15:26:00.624976 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.624910 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d20d917-d834-4255-be95-1c2f55a8017a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8d20d917-d834-4255-be95-1c2f55a8017a" (UID: "8d20d917-d834-4255-be95-1c2f55a8017a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 15:26:00.677191 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.677161 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d20d917-d834-4255-be95-1c2f55a8017a" (UID: "8d20d917-d834-4255-be95-1c2f55a8017a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:26:00.723653 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.723624 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-home\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.723653 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.723649 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-dshm\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.723826 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.723662 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjldv\" (UniqueName: \"kubernetes.io/projected/8d20d917-d834-4255-be95-1c2f55a8017a-kube-api-access-pjldv\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.723826 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.723676 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-model-cache\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.723826 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.723689 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8d20d917-d834-4255-be95-1c2f55a8017a-tls-certs\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:26:00.723826 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:00.723700 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d20d917-d834-4255-be95-1c2f55a8017a-kserve-provision-location\") on node \"ip-10-0-137-160.ec2.internal\" DevicePath \"\"" Apr 16 15:26:01.028685 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:01.028660 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-c2x8q_ab3b5db3-14b3-4d11-9a61-e95b243cb02b/kuadrant-console-plugin/0.log" Apr 16 15:26:01.077928 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:01.077901 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-fzj2d_4ff8b8dd-6ce4-4a06-8131-0e00309c7be2/limitador/0.log" Apr 16 15:26:01.093733 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:01.093707 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-hm2tg_5fd777a3-efab-49b1-a04c-c04fc56165de/manager/0.log" Apr 16 15:26:01.459938 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:01.459911 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg" Apr 16 15:26:01.477701 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:01.477673 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg"] Apr 16 15:26:01.481647 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:01.481622 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-86d9dd65cc-h96dg"] Apr 16 15:26:02.935658 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:02.935625 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" path="/var/lib/kubelet/pods/8d20d917-d834-4255-be95-1c2f55a8017a/volumes" Apr 16 15:26:06.331640 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:06.331585 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-js8tx_e6782489-2452-4e2c-9e5c-ff11be2380fe/global-pull-secret-syncer/0.log" Apr 16 15:26:06.426727 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:06.426699 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9dmjs_9613bf50-dbc0-4e6d-aeba-8f63da3babdb/konnectivity-agent/0.log" Apr 16 15:26:06.472408 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:06.472385 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-160.ec2.internal_0cc6f13ece139d4b95b6457b753b2eb6/haproxy/0.log" Apr 16 15:26:10.727527 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:10.727443 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-c2x8q_ab3b5db3-14b3-4d11-9a61-e95b243cb02b/kuadrant-console-plugin/0.log" Apr 16 15:26:10.795786 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:10.795758 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-fzj2d_4ff8b8dd-6ce4-4a06-8131-0e00309c7be2/limitador/0.log" Apr 16 15:26:10.819698 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:10.819651 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-hm2tg_5fd777a3-efab-49b1-a04c-c04fc56165de/manager/0.log" Apr 16 15:26:11.865804 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:11.865759 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-4v4pn_d6202eab-2202-486c-9663-9a51687b0dc8/cluster-monitoring-operator/0.log" Apr 16 15:26:11.887029 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:11.887010 2580 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-s8gz9_90d7f787-82ab-4219-b168-822d1a382c4e/kube-state-metrics/0.log" Apr 16 15:26:11.904790 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:11.904766 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-s8gz9_90d7f787-82ab-4219-b168-822d1a382c4e/kube-rbac-proxy-main/0.log" Apr 16 15:26:11.923072 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:11.923049 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-s8gz9_90d7f787-82ab-4219-b168-822d1a382c4e/kube-rbac-proxy-self/0.log" Apr 16 15:26:12.082668 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.082644 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lhvl9_9a6c274c-eb63-47f4-95cb-022efb3ff364/node-exporter/0.log" Apr 16 15:26:12.100114 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.100090 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lhvl9_9a6c274c-eb63-47f4-95cb-022efb3ff364/kube-rbac-proxy/0.log" Apr 16 15:26:12.120763 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.120703 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lhvl9_9a6c274c-eb63-47f4-95cb-022efb3ff364/init-textfile/0.log" Apr 16 15:26:12.301408 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.301382 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6ba0d8cd-bc51-43f5-843d-a9ab7a40000c/prometheus/0.log" Apr 16 15:26:12.316806 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.316784 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6ba0d8cd-bc51-43f5-843d-a9ab7a40000c/config-reloader/0.log" Apr 16 15:26:12.341371 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.341341 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6ba0d8cd-bc51-43f5-843d-a9ab7a40000c/thanos-sidecar/0.log" Apr 16 15:26:12.368544 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.368524 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6ba0d8cd-bc51-43f5-843d-a9ab7a40000c/kube-rbac-proxy-web/0.log" Apr 16 15:26:12.408965 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.408917 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6ba0d8cd-bc51-43f5-843d-a9ab7a40000c/kube-rbac-proxy/0.log" Apr 16 15:26:12.440940 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.440915 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6ba0d8cd-bc51-43f5-843d-a9ab7a40000c/kube-rbac-proxy-thanos/0.log" Apr 16 15:26:12.459600 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.459584 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6ba0d8cd-bc51-43f5-843d-a9ab7a40000c/init-config-reloader/0.log" Apr 16 15:26:12.495713 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.495691 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-pp9nl_db42dd70-d787-44af-bbaa-54821221e8eb/prometheus-operator/0.log" Apr 16 15:26:12.515717 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:12.515698 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-pp9nl_db42dd70-d787-44af-bbaa-54821221e8eb/kube-rbac-proxy/0.log" Apr 16 15:26:14.609296 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:14.609267 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/1.log" Apr 16 15:26:14.614143 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:14.614094 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-jxm62_f5d0644e-6880-4f5d-8d37-6b6693b0bfea/console-operator/2.log" Apr 16 15:26:15.116481 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.116458 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-b8wv8_b9c6c00d-c7ce-49a6-8ad8-1af23b6a7f66/download-server/0.log" Apr 16 15:26:15.496419 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.496342 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9"] Apr 16 15:26:15.496976 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.496952 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="storage-initializer" Apr 16 15:26:15.497113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.496979 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="storage-initializer" Apr 16 15:26:15.497113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497002 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" Apr 16 15:26:15.497113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497011 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" Apr 16 15:26:15.497113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497082 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerName="storage-initializer" Apr 16 15:26:15.497113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497093 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerName="storage-initializer" Apr 16 15:26:15.497113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497106 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="llm-d-routing-sidecar" Apr 16 15:26:15.497113 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497116 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="llm-d-routing-sidecar" Apr 16 15:26:15.497459 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497142 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerName="main" Apr 16 15:26:15.497459 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497150 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerName="main" Apr 16 15:26:15.497459 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497231 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" 
containerName="llm-d-routing-sidecar" Apr 16 15:26:15.497459 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497245 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d20d917-d834-4255-be95-1c2f55a8017a" containerName="main" Apr 16 15:26:15.497459 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.497256 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="a01539b4-0e53-4ab0-b751-4d76e2138fff" containerName="main" Apr 16 15:26:15.500122 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.500104 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.502214 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.502197 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5p4n\"/\"openshift-service-ca.crt\"" Apr 16 15:26:15.503088 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.503053 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f5p4n\"/\"default-dockercfg-fv9wn\"" Apr 16 15:26:15.503202 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.503095 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f5p4n\"/\"kube-root-ca.crt\"" Apr 16 15:26:15.510688 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.510665 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9"] Apr 16 15:26:15.545350 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.545328 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-sys\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.545498 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.545422 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bnzz\" (UniqueName: \"kubernetes.io/projected/75effc27-3d6a-4cce-9bc5-7081bbbe8288-kube-api-access-6bnzz\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.545498 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.545467 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-proc\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.545498 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.545491 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-lib-modules\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.545637 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.545533 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-podres\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.560288 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.560268 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-hh9lf_7221e3d8-8a94-4e43-88a3-0261e15e31c2/volume-data-source-validator/0.log" Apr 16 15:26:15.646060 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646038 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-podres\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646075 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-sys\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646133 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bnzz\" (UniqueName: \"kubernetes.io/projected/75effc27-3d6a-4cce-9bc5-7081bbbe8288-kube-api-access-6bnzz\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-proc\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646190 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-lib-modules\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646194 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-podres\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646222 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-sys\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646259 2580 
Apr 16 15:26:15.646403 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.646305 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/75effc27-3d6a-4cce-9bc5-7081bbbe8288-lib-modules\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9"
Apr 16 15:26:15.654722 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.654698 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bnzz\" (UniqueName: \"kubernetes.io/projected/75effc27-3d6a-4cce-9bc5-7081bbbe8288-kube-api-access-6bnzz\") pod \"perf-node-gather-daemonset-dskh9\" (UID: \"75effc27-3d6a-4cce-9bc5-7081bbbe8288\") " pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9"
Apr 16 15:26:15.813051 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.812964 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9"
Apr 16 15:26:15.930910 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.930807 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9"]
Apr 16 15:26:15.933560 ip-10-0-137-160 kubenswrapper[2580]: W0416 15:26:15.933519 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod75effc27_3d6a_4cce_9bc5_7081bbbe8288.slice/crio-a14bae2708de926a604606c08d6dd7923c9d761c3475b3f4409b4e681bde6239 WatchSource:0}: Error finding container a14bae2708de926a604606c08d6dd7923c9d761c3475b3f4409b4e681bde6239: Status 404 returned error can't find the container with id a14bae2708de926a604606c08d6dd7923c9d761c3475b3f4409b4e681bde6239
Apr 16 15:26:15.935110 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:15.935095 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:26:16.255922 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.255901 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dlgsh_89bc5259-a854-4a23-908c-c4af285bd699/dns/0.log"
Apr 16 15:26:16.273425 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.273409 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dlgsh_89bc5259-a854-4a23-908c-c4af285bd699/kube-rbac-proxy/0.log"
Apr 16 15:26:16.423491 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.423469 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qswpg_48dde1ba-8911-4c19-9083-79bd3339f3bf/dns-node-resolver/0.log"
Apr 16 15:26:16.510005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.509936 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" event={"ID":"75effc27-3d6a-4cce-9bc5-7081bbbe8288","Type":"ContainerStarted","Data":"c7c950808c73a2fd4ef452395f4c1c9856c1d1fcd79ee23aee8a167be93b8e72"}
Apr 16 15:26:16.510005 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.509966 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" event={"ID":"75effc27-3d6a-4cce-9bc5-7081bbbe8288","Type":"ContainerStarted","Data":"a14bae2708de926a604606c08d6dd7923c9d761c3475b3f4409b4e681bde6239"}
pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" event={"ID":"75effc27-3d6a-4cce-9bc5-7081bbbe8288","Type":"ContainerStarted","Data":"a14bae2708de926a604606c08d6dd7923c9d761c3475b3f4409b4e681bde6239"} Apr 16 15:26:16.510164 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.510081 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:16.527068 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.527025 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" podStartSLOduration=1.527013588 podStartE2EDuration="1.527013588s" podCreationTimestamp="2026-04-16 15:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:26:16.524536931 +0000 UTC m=+2004.177919304" watchObservedRunningTime="2026-04-16 15:26:16.527013588 +0000 UTC m=+2004.180395988" Apr 16 15:26:16.886350 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.886320 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-594f8c9465-vncdh_e1c716fe-66ae-400c-a328-11d7504d5480/registry/0.log" Apr 16 15:26:16.922768 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:16.922744 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qpspz_9ab7c27b-be98-41cf-bbea-2ed5ab71d83f/node-ca/0.log" Apr 16 15:26:18.239022 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:18.238987 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lstqc_a1c49e18-2e18-4c91-9c90-58bd53f03775/serve-healthcheck-canary/0.log" Apr 16 15:26:18.627687 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:18.627665 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-tx72c_9915bbf3-08d3-4eb4-b977-389f37e66425/insights-operator/0.log" Apr 16 15:26:18.627931 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:18.627908 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-tx72c_9915bbf3-08d3-4eb4-b977-389f37e66425/insights-operator/1.log" Apr 16 15:26:18.645280 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:18.645260 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8j958_9b8ad91f-8e14-47f4-b2ae-495edea3e670/kube-rbac-proxy/0.log" Apr 16 15:26:18.665566 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:18.665546 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8j958_9b8ad91f-8e14-47f4-b2ae-495edea3e670/exporter/0.log" Apr 16 15:26:18.684327 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:18.684307 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8j958_9b8ad91f-8e14-47f4-b2ae-495edea3e670/extractor/0.log" Apr 16 15:26:21.310488 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:21.310437 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bfb495c77-8964n_e7415c4a-0b15-4a23-a1f7-024a2d5b2d66/manager/0.log" Apr 16 15:26:21.865969 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:21.865944 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_llmisvc-controller-manager-5c55c5c99-b46jq_9488bf4c-5b0d-46e2-ba34-04456010e7af/manager/0.log" Apr 16 15:26:22.524288 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:22.524256 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f5p4n/perf-node-gather-daemonset-dskh9" Apr 16 15:26:27.023531 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:27.023495 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-pfnkn_2ff993dc-1d95-4aaf-b8a3-233fbf6081af/kube-storage-version-migrator-operator/1.log" Apr 16 15:26:27.024420 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:27.024403 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-pfnkn_2ff993dc-1d95-4aaf-b8a3-233fbf6081af/kube-storage-version-migrator-operator/0.log" Apr 16 15:26:27.993451 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:27.993426 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-68rx2_475f512a-706c-424b-b38f-428bf1b64f69/kube-multus/0.log" Apr 16 15:26:28.307110 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.307075 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4dhd_4c7680da-cb3c-4ad2-b143-8ff457f88efe/kube-multus-additional-cni-plugins/0.log" Apr 16 15:26:28.328646 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.328624 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4dhd_4c7680da-cb3c-4ad2-b143-8ff457f88efe/egress-router-binary-copy/0.log" Apr 16 15:26:28.351622 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.351606 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4dhd_4c7680da-cb3c-4ad2-b143-8ff457f88efe/cni-plugins/0.log" Apr 16 15:26:28.373150 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.373132 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4dhd_4c7680da-cb3c-4ad2-b143-8ff457f88efe/bond-cni-plugin/0.log" Apr 16 15:26:28.392645 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.392626 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4dhd_4c7680da-cb3c-4ad2-b143-8ff457f88efe/routeoverride-cni/0.log" Apr 16 15:26:28.410644 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.410626 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4dhd_4c7680da-cb3c-4ad2-b143-8ff457f88efe/whereabouts-cni-bincopy/0.log" Apr 16 15:26:28.432907 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.432889 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w4dhd_4c7680da-cb3c-4ad2-b143-8ff457f88efe/whereabouts-cni/0.log" Apr 16 15:26:28.514082 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.514022 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kgs47_5c30c303-f0bf-425c-bb3f-ce75dde11fe3/network-metrics-daemon/0.log" Apr 16 15:26:28.531026 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:28.531006 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-kgs47_5c30c303-f0bf-425c-bb3f-ce75dde11fe3/kube-rbac-proxy/0.log" Apr 16 15:26:29.623089 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.623060 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-controller/0.log" Apr 16 15:26:29.639396 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.639372 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/0.log" Apr 16 15:26:29.648246 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.648223 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovn-acl-logging/1.log" Apr 16 15:26:29.665701 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.665680 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/kube-rbac-proxy-node/0.log" Apr 16 15:26:29.684728 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.684702 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 15:26:29.701730 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.701710 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/northd/0.log" Apr 16 15:26:29.720563 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.720548 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/nbdb/0.log" Apr 16 15:26:29.738639 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.738608 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/sbdb/0.log" Apr 16 15:26:29.832003 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:29.831978 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-czzzx_8e0c1234-5484-4c8c-9e5f-c0d64478ef21/ovnkube-controller/0.log" Apr 16 15:26:31.416938 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:31.416910 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wvkzh_f33dcc16-237a-4c31-aca0-f46c9648fc20/network-check-target-container/0.log" Apr 16 15:26:32.364109 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:32.364084 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-5dfrs_70c41107-b96d-429c-a82c-270215f0994f/iptables-alerter/0.log" Apr 16 15:26:33.192886 ip-10-0-137-160 kubenswrapper[2580]: I0416 15:26:33.192864 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bs8ms_b533746b-70b1-42cc-ab44-8b3907cf75a3/tuned/0.log"